adapters and torch-adapters

adapter-hub/adapters is a comprehensive, production-ready framework for parameter-efficient transfer learning across many model architectures. torch-adapters is a minimal PyTorch implementation of adapter modules, better suited as a lightweight alternative or educational reference than as a complement to the more established ecosystem.

                 adapters            torch-adapters
Score            82 (Verified)       27 (Experimental)
Maintenance      13/25               0/25
Adoption         22/25               9/25
Maturity         25/25               18/25
Community        22/25               0/25
Stars            2,802               9
Forks            375
Downloads        86,888              44
Commits (30d)    1                   0
Language         Python              Python
License          Apache-2.0          MIT
Risk flags       none                Stale 6m

About adapters

adapter-hub/adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Integrates 10+ parameter-efficient fine-tuning methods (LoRA, prefix tuning, bottleneck adapters, etc.) into 20+ HuggingFace Transformer models via a unified API. Supports advanced composition patterns like adapter merging via task arithmetic and parallel/sequential adapter stacking, plus quantized training variants (Q-LoRA, Q-Bottleneck). Built as a drop-in extension to the Transformers library with minimal code changes needed for both training and inference.
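
A minimal sketch of that unified API, using the library's documented `AutoAdapterModel`, `add_adapter`, `train_adapter`, and composition interfaces; the checkpoint and adapter names here are placeholders:

```python
from adapters import AutoAdapterModel
import adapters.composition as ac

# Load a supported Hugging Face checkpoint with adapter support enabled.
model = AutoAdapterModel.from_pretrained("roberta-base")

# Attach adapters through one API; the config string selects the
# PEFT method (e.g. "seq_bn" bottleneck, "lora", "prefix_tuning").
model.add_adapter("task_a", config="seq_bn")
model.add_adapter("task_b", config="seq_bn")

# Freeze the base model so only the named adapter's weights train.
model.train_adapter("task_a")

# Composition: run the two bottleneck adapters sequentially (stacking).
model.set_active_adapters(ac.Stack("task_a", "task_b"))
```

Because the base model stays frozen, each adapter can be trained, saved, and swapped independently of the full checkpoint.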

About torch-adapters

ma2za/torch-adapters

Small Library of PyTorch Adaptation modules
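
To illustrate what such an adaptation module typically looks like, here is a hypothetical bottleneck adapter in plain PyTorch; this is a generic sketch of the pattern, not torch-adapters' actual API:

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Down-project -> nonlinearity -> up-project, plus a residual connection.

    Hypothetical example of the adapter pattern, not torch-adapters' API.
    """

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()
        # Zero-init the up-projection so the module starts as an identity
        # and barely perturbs the frozen pretrained model early in training.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

Inserting a module like this after each frozen transformer layer and training only the adapter parameters is the core idea behind parameter-efficient fine-tuning.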

Scores updated daily from GitHub, PyPI, and npm data.