CASE-Lab-UMD/LLM-Drop
The official implementation of the paper "Uncovering the Redundancy in Transformers via a Unified Study of Layer Dropping (TMLR)".
Quality score: 48 / 100 (Emerging)
189 stars. No package published; no dependents.
Score breakdown:
  Maintenance: 13 / 25
  Adoption: 10 / 25
  Maturity: 9 / 25
  Community: 16 / 25
Repository stats:
  Stars: 189
  Forks: 24
  Language: Python
  License: Apache-2.0
  Category:
  Last pushed: Mar 06, 2026
  Commits (30d): 0
Get this data via API:
  curl "https://pt-edge.onrender.com/api/v1/quality/transformers/CASE-Lab-UMD/LLM-Drop"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives:
  AI-Hypercomputer/maxtext (92): A simple, performant and scalable Jax LLM!
  mosaicml/llm-foundry (71): LLM training code for Databricks foundation models
  rasbt/reasoning-from-scratch (71): Implement a reasoning LLM in PyTorch from scratch, step by step
  mindspore-lab/mindnlp (69): MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
  ridgerchu/matmulfreellm (50): Implementation for MatMul-free LM.