Lite & super-fast re-ranking for your search & retrieval pipelines. Supports SoTA listwise and pairwise reranking based on LLMs, cross-encoders, and more. Created by Prithivi Da; open for PRs & collaborations.
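A minimal sketch of pairwise cross-encoder reranking, shown here with the sentence-transformers CrossEncoder class as a generic stand-in; the model name, query, and passages are assumptions for illustration, not this library's own API:

```python
# Score each (query, passage) pair jointly with a cross-encoder, then sort by score.
from sentence_transformers import CrossEncoder

query = "how do rerankers improve retrieval?"
passages = [
    "Cross-encoders jointly encode the query and passage for finer-grained relevance.",
    "BM25 is a classic lexical ranking function.",
    "Bi-encoders embed query and passage independently for fast retrieval.",
]

# The checkpoint below is an assumption; any cross-encoder reranker works here.
model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
scores = model.predict([(query, p) for p in passages])

# Higher score = more relevant; keep passages in descending score order.
reranked = [p for _, p in sorted(zip(scores, passages), key=lambda x: x[0], reverse=True)]
print(reranked[0])
```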
Improving Bilingual Lexicon Induction with Cross-Encoder Reranking (Findings of EMNLP 2022). Keywords: Bilingual Lexicon Induction, Word Translation, Cross-Lingual Word Embeddings.
This repository demonstrates an end-to-end approach to information retrieval, document re-ranking, and language model integration, using techniques such as document chunking, embedding projection, and automatic query expansion to improve retrieval effectiveness.
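A minimal sketch of fixed-size document chunking with overlap, one of the techniques listed above; the function name, chunk size, and overlap are assumptions for illustration, not the repository's actual implementation:

```python
# Split a long document into overlapping word-window chunks before embedding.
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    words = text.split()
    step = chunk_size - overlap  # advance by less than the window to create overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
    return chunks

# Example: a 1000-word document yields overlapping 200-word chunks.
print(len(chunk_text("word " * 1000)))
```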
Baseline models for searching movie plots drawn from Wikipedia articles. Techniques include BM25 (lexical search), bi/cross-encoding (semantic search), and retrieval-augmented generation (RAG) using Mistral 7B through Fireworks.ai.
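A minimal sketch of the BM25 lexical-search baseline using the rank_bm25 package; the example plots and query are placeholders, not the repository's data or code:

```python
# Rank plot summaries against a query with BM25 over whitespace-tokenized text.
from rank_bm25 import BM25Okapi

plots = [
    "A farm boy joins a rebellion against a galactic empire.",
    "A hobbit carries a cursed ring across a dangerous land.",
    "A shark terrorizes a small beach town one summer.",
]
tokenized_plots = [p.lower().split() for p in plots]
bm25 = BM25Okapi(tokenized_plots)

query = "ring quest through dangerous lands".lower().split()
scores = bm25.get_scores(query)

# Return the plot with the highest lexical-overlap score.
best = max(range(len(plots)), key=lambda i: scores[i])
print(plots[best])
```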