Parameter-Efficient Adapters for Transformer Models
Tools and libraries for implementing adapter modules, LoRA, and other parameter-efficient transfer learning methods for transformers. Includes adapter frameworks, modular fine-tuning approaches, and techniques to reduce trainable parameters. Does NOT include full model fine-tuning, general compression/pruning methods, or domain-specific applications without adapter focus.
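To make the scope concrete: the methods tracked here replace full fine-tuning with small trainable add-ons to a frozen backbone. The sketch below shows the core LoRA idea in PyTorch; `LoRALinear` and its defaults are illustrative assumptions, not the API of any project in the table.

```python
# Minimal LoRA-style layer (illustrative sketch, not any listed library's API).
# The pretrained weight stays frozen; only the low-rank factors A and B train.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (B @ A)."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        self.lora_a = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, r))  # zero init: update starts at 0
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base output plus scaled low-rank correction; gradients flow only to lora_a/lora_b.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling
```

For a 768×768 projection with r=8, this trains 8 × (768 + 768) = 12,288 parameters, about 2% of the 589,824 frozen ones, which is the parameter saving these libraries package up.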
18 parameter-efficient adapter models are tracked. 1 scores above 70 (verified tier). The highest-rated is adapter-hub/adapters at 82/100, with 2,802 stars and 86,888 monthly downloads. 1 of the top 10 is actively maintained.
Get all 18 projects as JSON:

    curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=parameter-efficient-adapters&limit=20"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
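The same call from Python, as a minimal sketch; the response is assumed to be a JSON array with fields matching the table below, since the schema is not documented here.

```python
# Fetch the tracked projects as JSON. The response shape (a list of
# objects with model/score/tier fields) is an assumption, not a documented schema.
import requests

resp = requests.get(
    "https://pt-edge.onrender.com/api/v1/datasets/quality",
    params={
        "domain": "transformers",
        "subcategory": "parameter-efficient-adapters",
        "limit": 20,
    },
    timeout=10,
)
resp.raise_for_status()
for item in resp.json():  # assumed structure; inspect the raw response first
    print(item)
```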
| # | Model | Description | Score | Tier |
|---|-------|-------------|-------|------|
| 1 | adapter-hub/adapters | A Unified Library for Parameter-Efficient and Modular Transfer Learning | 82 | Verified |
| 2 | gaussalgo/adaptor | ACL 2022: Adaptor: a library to easily adapt a language model to your own... | | Emerging |
| 3 | ylsung/VL_adapter | PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for... | | Emerging |
| 4 | intersun/LightningDOT | source code and pre-trained/fine-tuned checkpoint for NAACL 2021 paper LightningDOT | | Emerging |
| 5 | kyegomez/M2PT | Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway:... | | Experimental |
| 6 | calpt/awesome-adapter-resources | Collection of Tools and Papers related to Adapters / Parameter-Efficient... | | Experimental |
| 7 | ma2za/torch-adapters | Small Library of PyTorch Adaptation modules | | Experimental |
| 8 | adapter-hub/efficient-task-transfer | Research code for "What to Pre-Train on? Efficient Intermediate Task... | | Experimental |
| 9 | seetrex-ai/kuraformer | Reduce LLM inference compute by 4x with no accuracy loss. Oscillatory... | | Experimental |
| 10 | eladwf/adaptive-multirate-transformers | DSP-inspired multirate wrappers for GPT with adaptive hyperparameters and... | | Experimental |
| 11 | IsaacRodgz/Multimodal-Adapters | Adapter modules with support for multimodal fusion of information (text,... | | Experimental |
| 12 | itsShnik/adaptively-finetuning-transformers | Adaptively fine tuning transformer based models for multiple domains and... | | Experimental |
| 13 | kyegomez/PaLM2-VAdapter | Implementation of "PaLM2-VAdapter:" from the multi-modal model paper:... | | Experimental |
| 14 | mobarski/aidapter | Adapter / facade for language models (OpenAI, Anthropic, Cohere, local... | | Experimental |
| 15 | gbyuvd/ImadaremV | Implicitly Adaptive Refinement Model V | | Experimental |
| 16 | mlsw/partial-embedding-matrix-adaptation | Vocabulary-level memory efficiency for language model fine-tuning. | | Experimental |
| 17 | thesofakillers/CLAfICLe | Official repository for the paper "CLAfICLe: Cross-Lingual Adaptation for... | | Experimental |
| 18 | TaoYang225/AD-DROP | Source code of NeurIPS 2022 accepted paper "AD-DROP: Attribution-Driven... | | Experimental |
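For the top-ranked entry, adapter-hub/adapters, a minimal setup looks roughly like the sketch below. It follows the library's documented pattern of initializing a plain Hugging Face model for adapter support, but the API and config names have changed across versions, so treat it as an outline and check the current docs.

```python
# Bottleneck-adapter fine-tuning sketch with adapter-hub/adapters
# (pip install adapters). Calls follow the library's documented interface,
# which may differ between versions.
from transformers import AutoModelForSequenceClassification
import adapters

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
adapters.init(model)                            # add adapter support to a vanilla HF model
model.add_adapter("my_task", config="seq_bn")   # sequential bottleneck (Pfeiffer-style) adapter
model.train_adapter("my_task")                  # freeze the backbone, train only the adapter
```

`train_adapter` is what delivers the parameter savings here: it freezes every backbone weight and leaves only the named adapter's parameters trainable, so a standard Trainer loop can be used unchanged.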