adapters and adaptor
Both libraries tackle parameter-efficient adaptation of language models with overlapping scope: adapter-hub/adapters is the more mature and widely adopted unified framework, while gaussalgo/adaptor focuses on adapting a model to a specific task or domain with custom training objectives.
About adapters
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Integrates 10+ parameter-efficient fine-tuning methods (LoRA, prefix tuning, bottleneck adapters, etc.) into 20+ Hugging Face Transformers model architectures via a unified API. Supports advanced composition patterns such as adapter merging via task arithmetic and parallel/sequential adapter stacking, plus quantized training variants (Q-LoRA, Q-Bottleneck). Built as a drop-in extension to the Transformers library, requiring minimal code changes for both training and inference.
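To see what "parameter-efficient" means for a method like LoRA (one of the techniques the library integrates), here is a minimal NumPy sketch of the underlying idea. This is illustrative only, not the adapters library's API; all names and shapes are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2                             # hidden size d, low rank r << d
W = rng.standard_normal((d, d))         # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01  # trainable down-projection
B = np.zeros((r, d))                    # trainable up-projection (zero init)
alpha = 16                              # scaling hyperparameter

def lora_forward(x):
    # frozen path plus a low-rank trainable delta
    return x @ W + (alpha / r) * (x @ A @ B)

# Only A and B are trained: 2*d*r parameters instead of d*d.
trainable = A.size + B.size             # 32 here, versus 64 full weights

# At inference, the delta can be merged back into the frozen weight,
# which is the same trick behind adapter merging:
W_merged = W + (alpha / r) * (A @ B)

x = rng.standard_normal((1, d))
assert np.allclose(lora_forward(x), x @ W_merged)
```

The savings grow with model size: for a realistic hidden size, 2*d*r trainable parameters is a small fraction of the d*d weights that stay frozen.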
About adaptor
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or custom objective(s).
Scores are updated daily from GitHub, PyPI, and npm data.