arcee-ai/DALM

Domain Adapted Language Modeling Toolkit - E2E RAG

Quality score: 45 / 100 (Emerging)

Implements fully differentiable end-to-end RAG training that jointly optimizes retriever and decoder-only generator models (Llama, Falcon, GPT) using in-batch negatives for efficiency. Supports both retriever-only contrastive learning and joint RAG-e2e fine-tuning pipelines with synthetic data generation via the `dalm` CLI, compatible with any Hugging Face embedding or language model. Includes evaluation harness for retriever recall/hit-rate metrics and pre-built domain-adapted examples (patents, PubMed, SEC filings).
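
For readers unfamiliar with the in-batch-negatives trick the description mentions: each query's positive passage sits on the diagonal of a batch-wise similarity matrix, and every other passage in the batch doubles as a free negative, so a batch of N pairs yields N - 1 negatives per query with no extra encoding passes. A minimal PyTorch sketch of that loss (illustrative only; the function and parameter names here are my own, not DALM's API):

import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(query_emb, passage_emb, temperature=0.05):
    # query_emb, passage_emb: (batch, dim); row i of each forms a positive pair.
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(passage_emb, dim=-1)
    logits = q @ p.T / temperature                      # (batch, batch) cosine similarities
    labels = torch.arange(q.size(0), device=q.device)   # positives lie on the diagonal
    return F.cross_entropy(logits, labels)              # each query scored against the whole batch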

335 stars. No commits in the last 6 months.

Flags: Stale (6m) · No Package · No Dependents
Maintenance: 0/25
Adoption: 10/25
Maturity: 16/25
Community: 19/25
(The four 25-point categories sum to the overall score: 0 + 10 + 16 + 19 = 45.)

Stars: 335
Forks: 46
Language: Python
License: Apache-2.0
Last pushed: Nov 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/arcee-ai/DALM"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
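
Equivalently in Python, using only the standard library (a sketch: it assumes the endpoint returns a JSON body and makes no claims about the response's field names, which are not documented here):

import json
import urllib.request

url = "https://pt-edge.onrender.com/api/v1/quality/rag/arcee-ai/DALM"
with urllib.request.urlopen(url) as resp:   # anonymous tier: 100 requests/day
    data = json.load(resp)
print(json.dumps(data, indent=2))           # pretty-print whatever comes back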