A PyTorch implementation of the Transformer model in "Attention is All You Need".
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
My implementation of the original Transformer model (Vaswani et al.). It additionally includes a playground.py file for visualizing concepts that are otherwise hard to grasp. Pretrained IWSLT models are currently included.
A list of efficient attention modules
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Attention is all you need implementation
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
A Benchmark of Text Classification in PyTorch
A PyTorch implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"
Neural Machine Translation with Keras
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
An open-source implementation of "Scaling Autoregressive Multi-Modal Models: Pretraining and Instruction Tuning", a multimodal AI that uses just a decoder to generate both text and images
Implementation of the ScreenAI model from the paper: "A Vision-Language Model for UI and Infographics Understanding"
[CVPR 2024] Official implementation of the paper "Inversion-Free Image Editing with Natural Language"
Attention Is All You Need | a PyTorch Tutorial to Transformers
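Most of the repositories above implement the same core operation from the paper: scaled dot-product attention. The following is a minimal illustrative sketch in PyTorch, not taken from any of the listed projects; the function name and tensor shapes are assumptions chosen for the example.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k) -- hypothetical shapes for illustration
    d_k = q.size(-1)
    # Similarity scores scaled by sqrt(d_k), as described in "Attention is All You Need"
    scores = torch.matmul(q, k.transpose(-2, -1)) / (d_k ** 0.5)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # attention distribution over keys
    return torch.matmul(weights, v), weights

# Example usage with random tensors
q = k = v = torch.randn(2, 8, 16, 64)   # batch=2, heads=8, seq_len=16, d_k=64
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)            # (2, 8, 16, 64) and (2, 8, 16, 16)
```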