# Transformers (Attention Is All You Need)

## Articles

- Transformers: An Intuition
- Text Generation vs. Text2Text Generation
- LLM Parameters
- BERT: Overview
- GPT: Overview
- T5: Overview

## Repository Contents

### Types

- Encoder-Only Models
- Decoder-Only Models
- Encoder-Decoder Models

### Codes

- Tokenizer
- Word Embedding
- Semantic Search
- Tokenization
- Positional_Encoding
- Self Attention
- Extractive QnA with BERT
- Instruction Following using GPT
- Product Review using T5
- Decoder-Only Model
- Fill-Mask (Encoder-Only Model)
- Sentiment Analysis (Encoder-Only Model)
- Encoder-Decoder Model

### Theory

- Hand-written Notes
- Hyperparameter Basics
- Attention Is All You Need from scratch (WIP)
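As a quick taste of the Positional_Encoding topic listed above, here is a minimal NumPy sketch of the sinusoidal positional encoding from "Attention Is All You Need". The function name and dimensions are illustrative, not taken from this repository's code.

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding as defined in the original paper."""
    positions = np.arange(seq_len)[:, None]      # shape (seq_len, 1)
    dims = np.arange(d_model)[None, :]           # shape (1, d_model)
    # Each pair of dimensions shares one frequency: 1 / 10000^(2i / d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates             # shape (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])        # even dimensions: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])        # odd dimensions: cosine
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```

Because the encoding depends only on position and dimension, it can be precomputed once and added to the token embeddings before the first attention layer.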
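The Self Attention entry above can likewise be sketched in a few lines of NumPy: scaled dot-product attention where queries, keys, and values all come from the same sequence. This is a toy single-head version for intuition, not the repository's implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # 4 tokens, model dimension 8
# Self-attention: Q, K, and V are all derived from the same input x
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)  # (4, 8) (4, 4)
```

Each row of `w` sums to 1, so every output token is a convex combination of the value vectors, weighted by how strongly that token attends to the others.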
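For the Word Embedding and Semantic Search topics, the core idea is nearest-neighbour lookup by cosine similarity in embedding space. The tiny hand-made embedding table below is purely illustrative; real embeddings come from a trained model.

```python
import numpy as np

# Toy embedding table (hypothetical values, for illustration only).
vocab = {"king": 0, "queen": 1, "apple": 2}
emb = np.array([[0.90, 0.80, 0.10],
                [0.85, 0.82, 0.12],
                [0.10, 0.05, 0.95]])

def cosine(a, b):
    """Cosine similarity: dot product of the normalised vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = emb[vocab["king"]]
scores = {word: cosine(query, emb[idx]) for word, idx in vocab.items()}
# Semantic search: return the most similar word other than the query itself
best = max((w for w in vocab if w != "king"), key=lambda w: scores[w])
print(best)  # queen
```

The same ranking step scales to sentence or document embeddings: embed the query, score it against a precomputed index of embeddings, and return the top matches.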