SakanaAI/doc-to-lora

Hypernetworks that update LLMs to remember factual information

Quality score: 48 / 100 (Emerging)

Uses a hypernetwork architecture to generate LoRA weights from document contexts, enabling LLMs to instantly internalize and ground responses in provided information without fine-tuning. The approach trains a modulation module that dynamically adapts model behavior based on input documents, and supports both single and batched inference. The repository provides pre-trained checkpoints on Hugging Face, an interactive web demo, and experimental scripts for reproduction, including a needle-in-a-haystack evaluation.
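The core idea described above can be sketched in a few lines: a hypernetwork maps a pooled document embedding to the low-rank LoRA factors A and B, and the resulting update is added to a frozen base weight. This is a minimal NumPy illustration of the technique, not the repository's actual implementation; all sizes, the single-linear-layer hypernetwork, and the function names are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, rank, d_doc = 64, 4, 32

# Frozen base weight of one linear layer in the LLM (toy size, assumption).
W = rng.standard_normal((d_model, d_model)) * 0.02

# Hypernetwork: here just one linear map from a document embedding to the
# flattened LoRA factors A (rank x d_model) and B (d_model x rank).
H = rng.standard_normal((d_doc, 2 * rank * d_model)) * 0.02

def generate_lora(doc_embedding):
    """Hypothetical helper: produce LoRA factors from a document embedding."""
    flat = doc_embedding @ H
    A = flat[: rank * d_model].reshape(rank, d_model)
    B = flat[rank * d_model :].reshape(d_model, rank)
    return A, B

doc = rng.standard_normal(d_doc)   # stand-in for a pooled document embedding
A, B = generate_lora(doc)
delta = B @ A                      # rank-4 update generated from the document
W_adapted = W + delta              # base weights stay frozen; only delta varies
```

Because `delta` is a product of rank-4 factors, the update can never exceed rank 4 regardless of the document, which is what makes per-document weight generation cheap.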

545 stars.

No package · No dependents

Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 11 / 25
Community: 17 / 25


Stars: 545
Forks: 57
Language: Python
License: MIT
Last pushed: Mar 02, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/SakanaAI/doc-to-lora"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
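For callers who prefer Python over curl, the same endpoint can be built and fetched with the standard library. The base path and path segments below simply mirror the curl example; the response schema is not documented here, so the fetch is left as a commented-out step.

```python
# Hypothetical helper mirroring the curl example above; the endpoint layout
# (ecosystem/owner/repo) is inferred from that one URL, not from API docs.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

url = quality_url("transformers", "SakanaAI", "doc-to-lora")
print(url)

# To actually fetch (network access and a JSON response are assumed):
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```

Unauthenticated calls are limited to 100 requests/day, so batch lookups should cache responses or use a free key.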