mangopy/direct-rag-learning
Official code for the TOIS 2026 paper "Direct Retrieval-augmented Optimization: Synergizing Knowledge Selection and Language Models"
Implements a dual-loop optimization framework where retrieval selectors and answer generators co-train through mutual feedback, rather than independently fine-tuning components. Built on ColBERT for retrieval and supports multiple LLM backends via vLLM, with standardized dataset preprocessing for Natural Questions, HotpotQA, MuSiQue, and other multi-hop QA benchmarks against Wikipedia corpora. Includes end-to-end evaluation pipelines and wandb integration for training monitoring.
Stars: 276
Forks: 2
Language: Python
License: Apache-2.0
Last pushed: Jan 14, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/mangopy/direct-rag-learning"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
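The curl example above can be wrapped in a few lines of Python. This is a minimal sketch assuming only the URL pattern shown in the example; the response schema and any API-key mechanism are not documented here, so the helper (`quality_url`) is a hypothetical name and only builds the request URL:

```python
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub owner/repo pair,
    percent-encoding each path segment."""
    return f"{API_BASE}/{quote(owner, safe='')}/{quote(repo, safe='')}"

# Reproduces the curl example's URL:
print(quality_url("mangopy", "direct-rag-learning"))
# → https://pt-edge.onrender.com/api/v1/quality/rag/mangopy/direct-rag-learning
```

The result can be passed to any HTTP client (e.g. `urllib.request.urlopen` or `requests.get`); keyless use is rate-limited to 100 requests/day per the note above.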
Higher-rated alternatives
denser-org/denser-retriever
An enterprise-grade AI retriever designed to streamline AI integration into your applications,...
rayliuca/T-Ragx
Enhancing Translation with RAG-Powered Large Language Models
neuml/rag
🚀 Retrieval Augmented Generation (RAG) with txtai. Combine search and LLMs to find insights with...
NovaSearch-Team/RAG-Retrieval
Unify Efficient Fine-tuning of RAG Retrieval, including Embedding, ColBERT, ReRanker.
RulinShao/retrieval-scaling
Official repository for "Scaling Retrieval-Based Langauge Models with a Trillion-Token Datastore".