kmeng01/memit
Mass-editing thousands of facts into a transformer memory (ICLR 2023)
Implements mass model editing by locating and updating the MLP weights that store factual associations in transformer layers, enabling efficient batch updates to factual knowledge without full retraining. Targets GPT-style language models (tested on GPT-J and GPT-2) through a Python API that accepts structured rewrite requests specifying subject-prompt-target triples. Includes comprehensive evaluation tools measuring edit success, generalization, and side effects across thousands of simultaneous edits.
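To make the subject-prompt-target shape concrete, here is a minimal sketch of one rewrite request. The key names ("prompt", "subject", "target_new") follow the repo's demo code, but treat them as illustrative and check the README before relying on them.

```python
# Illustrative rewrite request in the subject-prompt-target form.
# Key names are taken from memit's demo and may differ across versions.
rewrite_request = [
    {
        "prompt": "{} plays the sport of",  # "{}" is filled with the subject
        "subject": "LeBron James",
        "target_new": {"str": "football"},  # the new fact to write in
    },
    # MEMIT applies thousands of such requests in a single batch edit.
]
```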
543 stars. No commits in the last 6 months.
Stars: 543
Forks: 72
Language: Python
License: MIT
Last pushed: Jan 31, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kmeng01/memit"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
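The same endpoint can be called from Python with only the standard library. The URL is the one given above; the "X-API-Key" header name for the keyed tier is an assumption, so check the service's docs for the real header.

```python
# Sketch: fetching the quality record with Python's standard library.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, api_key=None, timeout=10):
    """GET the quality record and decode the JSON body."""
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key is not None:
        req.add_header("X-API-Key", api_key)  # assumed header name
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

# Usage (anonymous tier, 100 requests/day):
#   data = fetch_quality("transformers", "kmeng01", "memit")
```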
Higher-rated alternatives
steering-vectors/steering-vectors: Steering vectors for transformer language models in PyTorch / Hugging Face
jianghoucheng/AlphaEdit: Null-Space Constrained Knowledge Editing for Language Models (ICLR 2025, Outstanding Paper)
boyiwei/alignment-attribution-code: Assessing the Brittleness of Safety Alignment via Pruning and Low-Rank Modifications (ICML 2024)
jianghoucheng/AnyEdit: Edit Any Knowledge Encoded in Language Models (ICML 2025)
zjunlp/KnowledgeCircuits: Knowledge Circuits in Pretrained Transformers (NeurIPS 2024)