adapters and torch-adapters
The adapter-hub/adapters library is a comprehensive, production-ready framework for parameter-efficient transfer learning across multiple model architectures. torch-adapters, by contrast, is a minimal PyTorch implementation of adapter modules, better suited as a lightweight alternative or educational reference than as a complement to the more established ecosystem.
About adapters
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Integrates 10+ parameter-efficient fine-tuning methods (LoRA, prefix tuning, bottleneck adapters, etc.) into 20+ HuggingFace Transformer models via a unified API. Supports advanced composition patterns like adapter merging via task arithmetic and parallel/sequential adapter stacking, plus quantized training variants (Q-LoRA, Q-Bottleneck). Built as a drop-in extension to the Transformers library with minimal code changes needed for both training and inference.
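To make the "minimal code changes" claim concrete, here is a short sketch following the library's documented workflow: initialize adapter support on an off-the-shelf Transformers model, attach named adapters by config string, and compose them. The checkpoint and adapter names ("task_a", "task_b") are placeholders, not anything prescribed by the library.

```python
# Sketch of the adapters workflow, assuming the documented API:
# adapters.init() retrofits a Transformers model in place,
# add_adapter() attaches a PEFT method by config name, and
# train_adapter() freezes all other weights for training.
import adapters
from adapters.composition import Stack
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)                            # enable adapter support

model.add_adapter("task_a", config="seq_bn")    # bottleneck adapter
model.add_adapter("task_b", config="lora")      # LoRA on the same model
model.train_adapter("task_a")                   # freeze base, train task_a

# Composition: run both adapters sequentially at inference time.
model.set_active_adapters(Stack("task_a", "task_b"))
```

The same model object remains a regular Transformers model throughout, which is what allows training and inference code to stay largely unchanged.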
About torch-adapters
ma2za/torch-adapters
A Small Library of PyTorch Adaptation Modules
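For readers unfamiliar with what such a library provides, the following is a generic sketch of the bottleneck adapter pattern (Houlsby et al., 2019) in plain PyTorch. It is illustrative only: the class name, dimensions, and initialization choices are hypothetical and do not reflect torch-adapters' actual interface.

```python
# Hypothetical bottleneck adapter: down-project, nonlinearity,
# up-project, residual connection. Not torch-adapters' real API.
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()
        # Zero-init the up-projection so the module starts as an
        # identity map and the frozen base model is undisturbed.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

A module like this is inserted after a Transformer sublayer; only its small projection matrices are trained, which is the core idea behind parameter-efficient fine-tuning.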