This repository accompanies a demonstration session at our company.

LLM-Project

The files gpt2.py and inference.py together form a test project that follows Andrej Karpathy's GPT-2 reproduction video lesson, with some changes. It trains on a small portion of the fineweb-edu dataset, 600M tokens (100M for validation), on an RTX 4090 Laptop GPU (16GB).
The Transformers folder contains different implementations of GPT, a fine-tuning example, and a RAG example.
That folder adapts some of its code from Transformers for Natural Language Processing and Computer Vision by Denis Rothman and Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
The main purpose of this project is to run a series of internal company sessions about LLMs and their implementation details.
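As a taste of the implementation details the sessions cover: the core of a GPT-2-style model is causal self-attention, where each position may only attend to earlier positions. The sketch below is a plain-Python, single-head illustration of that mechanism (not code from this repo — gpt2.py uses PyTorch, and real models are batched, multi-head, and vectorized):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def causal_attention(q, k, v):
    """Single-head scaled dot-product attention with a causal mask.

    q, k, v are lists of T vectors (each a list of d floats).
    Position i may only attend to positions j <= i, which is what
    lets a GPT-style model be trained to predict the next token.
    """
    d = len(q[0])
    out = []
    for i in range(len(q)):
        # scores for positions up to and including i; future
        # positions are simply excluded (the causal mask)
        scores = [
            sum(qi * kj for qi, kj in zip(q[i], k[j])) / math.sqrt(d)
            for j in range(i + 1)
        ]
        weights = softmax(scores)
        # weighted sum of the visible value vectors
        out.append([
            sum(w * v[j][t] for j, w in enumerate(weights))
            for t in range(d)
        ])
    return out
```

Because position 0 can only see itself, its output is exactly its own value vector; later positions blend all earlier values.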


Built With

  • Python==3.10.14
  • PyTorch==2.3.1+cu121

Installation

To set up and run these projects:

  1. Create a conda environment.
  2. Install the requirements.

conda create --name <ENV_NAME> python=3.10.14 --file requirements.txt
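Note that conda's `--file` flag expects a conda-formatted spec file. If requirements.txt is pip-formatted instead (an assumption, depending on how it was exported), an equivalent setup would be:

```shell
# create and activate the environment (the name is illustrative)
conda create --name llm-project python=3.10.14
conda activate llm-project

# install dependencies with pip if requirements.txt is pip-formatted
pip install -r requirements.txt
```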

Authors

👤 Eyüp Sercan UYGUR

Srjnnnn (Twitter)
eyupsercanuygur.com

🤝 Contributing

Contributions, issues and feature requests are welcome!

Show your support

Give a ⭐️ if you like this project!

Issues

For bug reports, check the project's issues page.

📝 License

This project is MIT licensed.
