
longformer

Here are 24 public repositories matching this topic...

Master's thesis, with code, investigating methods for adding long-context reasoning to low-resource languages without pre-training from scratch. We investigated whether multilingual models could inherit these properties by converting them into an efficient Transformer (such as the Longformer architecture); a minimal sketch of the position-embedding extension step is shown below.

  • Updated Aug 19, 2021
  • Jupyter Notebook
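
A minimal sketch of the core idea, assuming the Hugging Face `transformers` API and `xlm-roberta-base` as the multilingual backbone (the thesis's actual conversion script may differ): tile the pretrained position embeddings out to a longer context window, which is the first step in turning an existing encoder into a Longformer-style efficient Transformer without pre-training from scratch. Swapping the dense self-attention for sliding-window attention is a separate step not shown here.

```python
import torch
from transformers import XLMRobertaModel

MAX_POS = 4096 + 2  # XLM-R reserves positions 0 and 1 for padding/offset

model = XLMRobertaModel.from_pretrained("xlm-roberta-base")
old = model.embeddings.position_embeddings.weight.data  # shape (514, hidden)

# Tile the learned position embeddings to cover the longer context window.
new = old.new_empty(MAX_POS, old.size(1))
new[:2] = old[:2]
step = old.size(0) - 2
for start in range(2, MAX_POS, step):
    end = min(start + step, MAX_POS)
    new[start:end] = old[2:2 + (end - start)]

model.embeddings.position_embeddings = torch.nn.Embedding.from_pretrained(
    new, freeze=False, padding_idx=1
)

# Depending on the transformers version, these buffers exist with length 514;
# extend them so forward() works past the original maximum length.
model.embeddings.position_ids = torch.arange(MAX_POS).unsqueeze(0)
model.embeddings.token_type_ids = torch.zeros(1, MAX_POS, dtype=torch.long)
model.config.max_position_embeddings = MAX_POS
```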

This repository implements a novel approach for detecting Initial Public Offering (IPO) underpricing using pre-trained Transformers. The models, extended to handle large S-1 filings, leverage both textual information and financial indicators and outperform traditional machine-learning methods (a sketch of the text-plus-indicators fusion idea is shown below).

  • Updated Dec 2, 2024
  • Python
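
As a hedged illustration of the fusion idea described above (not the repository's actual model), one way to combine a long-document encoder's pooled text representation with tabular financial indicators is to concatenate the two and feed a small classification head. The encoder name, indicator count, and head sizes below are assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class FilingUnderpricingModel(nn.Module):
    """Toy fusion model: pooled text embedding + numeric financial indicators."""

    def __init__(self, encoder_name="allenai/longformer-base-4096", n_indicators=12):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.head = nn.Sequential(
            nn.Linear(hidden + n_indicators, 256),
            nn.ReLU(),
            nn.Linear(256, 1),  # e.g. an underpricing score or binary label
        )

    def forward(self, input_ids, attention_mask, indicators):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        text_vec = out.last_hidden_state[:, 0]  # first-token ("CLS"-style) pooling
        return self.head(torch.cat([text_vec, indicators], dim=-1))
```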

This project applies the Longformer model to sentiment analysis on the IMDB movie review dataset. The Longformer model, introduced in "Longformer: The Long-Document Transformer," tackles long-document processing with sliding-window and global attention mechanisms. The implementation is built in PyTorch and follows the paper's architecture (see the inference sketch below).

  • Updated Apr 7, 2023
  • Python
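
A minimal inference sketch of that setup using the Hugging Face `transformers` implementation of Longformer (an illustration, not the repository's code): sliding-window attention is the default, and the first token is given global attention so it can aggregate the whole review for classification.

```python
import torch
from transformers import LongformerTokenizerFast, LongformerForSequenceClassification

tokenizer = LongformerTokenizerFast.from_pretrained("allenai/longformer-base-4096")
model = LongformerForSequenceClassification.from_pretrained(
    "allenai/longformer-base-4096", num_labels=2  # positive / negative
)

review = "One of the best films I have seen in years..."
inputs = tokenizer(review, return_tensors="pt", truncation=True, max_length=4096)

# Local (sliding-window) attention is applied everywhere by default; mark the
# first token as global so it attends to, and is attended by, every token.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

with torch.no_grad():
    logits = model(**inputs, global_attention_mask=global_attention_mask).logits
print("predicted label:", logits.argmax(dim=-1).item())
```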

This project was developed for a Kaggle competition focused on detecting Personally Identifiable Information (PII) in student writing. The primary objective was to build a robust model that identifies PII with high recall. The DeBERTa v3 transformer model was chosen for this task after comparing its performance with that of other transformer models (a sketch of the token-classification setup is shown below).

  • Updated Jun 28, 2024
  • Jupyter Notebook
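
A hedged sketch of the general token-classification setup; the competition's full label set, the fine-tuned weights, and the training loop are omitted, and the labels shown are an assumed subset.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed subset of BIO-style PII labels for illustration only.
labels = ["O", "B-NAME_STUDENT", "I-NAME_STUDENT", "B-EMAIL", "B-PHONE_NUM"]

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForTokenClassification.from_pretrained(
    "microsoft/deberta-v3-base", num_labels=len(labels)
)

text = "My name is Jane Doe and my email is jane@example.com."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# The classification head is randomly initialized here, so these predictions
# are meaningless until the model is fine-tuned on BIO-labelled PII data.
with torch.no_grad():
    preds = model(**inputs).logits.argmax(dim=-1)[0]

for token, p in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), preds):
    print(token, labels[p.item()])
```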
