kyegomez/SingLoRA

This repository provides a minimal, single-file implementation of SingLoRA (Single Matrix Low-Rank Adaptation) as described in the paper "SingLoRA: Low Rank Adaptation Using a Single Matrix" by Bensaïd et al.

Score: 52 / 100 (Established)

Replaces dual low-rank matrices with a single trainable matrix applied across transformer attention layers, incorporating a time-dependent ramp-up function to gradually scale the adaptation during training. Integrates directly with Hugging Face Transformers models (DistilBERT, LLaMA) via drop-in layer replacement, achieving ~15% parameter reduction while maintaining fine-tuning capability on selective attention projections (q_proj, k_proj, v_proj).
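A minimal NumPy sketch of the idea described above, assuming the paper's ramp-up u(t) = min(t/T, 1) and the update ΔW = u(t) · (α/r) · A Aᵀ applied to a square attention projection. Function and parameter names here are illustrative, not this package's actual API.

```python
import numpy as np

def ramp(step, warmup_steps):
    """Time-dependent ramp-up u(t) = min(t/T, 1) that gradually
    enables the adaptation over the first T training steps."""
    return min(step / warmup_steps, 1.0)

def singlora_weight(W0, A, alpha, rank, step, warmup_steps):
    """Adapted weight W0 + u(t) * (alpha / r) * A @ A.T.
    A single trainable matrix A (d x r) replaces LoRA's B @ A pair,
    roughly halving the adapter parameter count per layer."""
    return W0 + ramp(step, warmup_steps) * (alpha / rank) * (A @ A.T)

# Toy example: a square d x d projection (e.g. q_proj) with rank-2 adapter.
d, r = 8, 2
rng = np.random.default_rng(0)
W0 = rng.standard_normal((d, d))   # frozen pretrained weight
A = np.zeros((d, r))               # zero init: no change at step 0
W = singlora_weight(W0, A, alpha=4.0, rank=r, step=0, warmup_steps=100)
```

With zero-initialized A and u(0) = 0, the adapted weight equals the pretrained weight, so fine-tuning starts from the base model's behavior.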

Available on PyPI.

Maintenance: 13 / 25
Adoption: 10 / 25
Maturity: 24 / 25
Community: 5 / 25


Stars: 44
Forks: 2
Language: Python
License: MIT
Last pushed: Mar 09, 2026
Monthly downloads: 7
Commits (30d): 0
Dependencies: 2

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/SingLoRA"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.