SakanaAI/doc-to-lora
Hypernetworks that update LLMs to remember factual information
Uses a hypernetwork architecture to generate LoRA weights from document contexts, letting an LLM internalize and ground its responses in provided information without per-document fine-tuning. The approach trains a modulation module that dynamically adapts model behavior based on input documents, supporting both single and batched inference. Provides pre-trained checkpoints on Hugging Face, an interactive web demo, and experimental scripts for reproduction, including a needle-in-a-haystack evaluation.
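The core idea (a network whose output is another network's low-rank weight update) can be sketched in a few lines. This is a minimal illustration with made-up dimensions and randomly initialized weights, not the repo's actual architecture: a pooled document embedding is mapped through one hidden layer to flattened LoRA factors A and B, whose product is the weight delta applied to a frozen base layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def hyper_lora(doc_emb, rank=8, target_dim=64, hidden=32):
    """Sketch of a hypernetwork emitting LoRA factors.
    All layer sizes and the tanh nonlinearity are illustrative assumptions."""
    doc_dim = doc_emb.shape[-1]
    # Randomly initialized hypernetwork weights (trained in the real system).
    W1 = rng.standard_normal((doc_dim, hidden)) * 0.02
    Wa = rng.standard_normal((hidden, rank * target_dim)) * 0.02
    Wb = rng.standard_normal((hidden, target_dim * rank)) * 0.02
    h = np.tanh(doc_emb @ W1)
    A = (h @ Wa).reshape(rank, target_dim)   # LoRA "down" factor
    B = (h @ Wb).reshape(target_dim, rank)   # LoRA "up" factor
    return B @ A  # rank-<=8 delta added to a frozen base weight matrix

doc_emb = rng.standard_normal(16)  # stand-in for a pooled document embedding
delta_w = hyper_lora(doc_emb)
print(delta_w.shape)  # (64, 64)
```

Because the delta is the product of two thin matrices, its rank is bounded by `rank`, which is what keeps per-document adaptation cheap relative to full fine-tuning.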
Stars
545
Forks
57
Language
Python
License
MIT
Last pushed
Mar 02, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/SakanaAI/doc-to-lora"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
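The same request can be made from Python with only the standard library. This is a sketch: the URL pattern comes from the curl example above, but the response schema and the `X-API-Key` header name are assumptions, not documented here.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category, owner, repo):
    """Assemble the endpoint URL used in the curl example."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, api_key=None):
    """Fetch the quality record for a repo as a dict.
    The X-API-Key header name is an assumption for keyed access."""
    req = urllib.request.Request(build_url(category, owner, repo))
    if api_key:
        req.add_header("X-API-Key", api_key)  # assumed header name
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

url = build_url("transformers", "SakanaAI", "doc-to-lora")
print(url)
```

Calling `fetch_quality("transformers", "SakanaAI", "doc-to-lora")` performs the actual request; within the free tier no key argument is needed.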