arcee-ai/mergekit

Tools for merging pretrained large language models.

Quality score: 59/100 (Established)

Supports multiple merge algorithms (linear interpolation, SLERP, task arithmetic, expert composition) and specialized techniques like LoRA extraction, layer-wise "Frankenmerging," and Mixture of Experts assembly. Uses an out-of-core lazy-loading approach to operate on CPU or minimal VRAM (8GB), enabling complex merges in resource-constrained environments. Integrates with Hugging Face Hub for model distribution and accepts YAML configurations for flexible multi-stage merge workflows across Llama, Mistral, GPT-NeoX, and other architectures.
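As a sketch of the YAML workflow mentioned above, a minimal SLERP merge configuration might look like the following. The structure (`slices`, `merge_method`, `base_model`, `parameters`, `dtype`) follows mergekit's documented schema; the specific model names, layer ranges, and interpolation factor are illustrative placeholders, not a recommendation:

```yaml
# Hypothetical SLERP merge of two 32-layer models (names are placeholders).
slices:
  - sources:
      - model: example-org/model-a-7b
        layer_range: [0, 32]
      - model: example-org/model-b-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: example-org/model-a-7b
parameters:
  t: 0.5        # interpolation factor: 0 = base model, 1 = the other model
dtype: bfloat16
```

A config like this is typically passed to the `mergekit-yaml` CLI along with an output directory; consult the repository's README for the authoritative option list.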

6,857 stars. Maintained, with 1 commit in the last 30 days.

No package published; no dependents.
Maintenance: 13/25
Adoption: 10/25
Maturity: 16/25
Community: 20/25


Stars: 6,857
Forks: 675
Language: Python
License: LGPL-3.0
Last pushed: Feb 28, 2026
Commits (30d): 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/arcee-ai/mergekit"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
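The same endpoint can be called from Python with only the standard library. This is a minimal sketch: the `/{ecosystem}/{owner}/{repo}` split of the URL path is an assumption inferred from the example above, and the response schema is not documented here, so the raw parsed JSON is returned as-is:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL.

    The {ecosystem}/{owner}/{repo} path structure is an assumption
    inferred from the example URL, not documented API behavior.
    """
    return f"{BASE}/{ecosystem}/{owner}/{repo}"


def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON response (schema unknown, returned raw)."""
    with urllib.request.urlopen(
        quality_url(ecosystem, owner, repo), timeout=10
    ) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Reconstructs the exact URL from the curl example above.
    print(quality_url("transformers", "arcee-ai", "mergekit"))
```

Within the keyless tier this stays under the 100 requests/day limit as long as calls are not issued in a loop.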