seetrex-ai/kuraformer
Reduce LLM inference compute by 4x with no accuracy loss. Oscillatory adapter for pretrained Transformers.
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Mar 14, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/seetrex-ai/kuraformer"
Open to everyone: 100 requests/day with no API key; a free key raises the limit to 1,000 requests/day.
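A minimal sketch of calling the same endpoint from Python with the requests library, assuming the response body is JSON; the optional API-key header name and the response fields are assumptions for illustration, not a documented schema.

import requests

# Quality endpoint for this repository (same URL as the curl example above).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/seetrex-ai/kuraformer"

# No key is needed for up to 100 requests/day; passing a key raises the limit
# to 1,000 requests/day. The header name below is an assumption, not documented.
headers = {}  # e.g. {"X-API-Key": "YOUR_KEY"}

resp = requests.get(URL, headers=headers, timeout=10)
resp.raise_for_status()   # fail loudly on rate limiting or other HTTP errors
data = resp.json()        # assumes a JSON payload; inspect it to see the fields
print(data)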
Higher-rated alternatives
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or...
ylsung/VL_adapter
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language...
intersun/LightningDOT
Source code and pre-trained/fine-tuned checkpoints for the NAACL 2021 paper LightningDOT
calpt/awesome-adapter-resources
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning