LLM Knowledge Distillation ML Frameworks

There are 3 LLM knowledge distillation frameworks tracked. The highest-rated is lasgroup/SDPO at 46/100 with 627 stars.

Get all 3 projects as JSON:

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=ml-frameworks&subcategory=llm-knowledge-distillation&limit=20"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
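The endpoint above returns JSON. A minimal sketch of consuming it in Python, assuming a hypothetical response shape with a `results` array of `{name, score, tier}` objects (the actual schema is not documented here, so a sample payload stands in for a live call):

```python
import json

# Hypothetical payload mirroring the table below -- the real schema of the
# /datasets/quality endpoint is an assumption, not documented in this page.
sample = json.loads("""
{
  "results": [
    {"name": "lasgroup/SDPO", "score": 46, "tier": "Emerging"},
    {"name": "machinelearningnuremberg/DPL", "score": 28, "tier": "Experimental"},
    {"name": "HUST-AI-HYZ/FARMS", "score": 18, "tier": "Experimental"}
  ]
}
""")

def top_frameworks(payload, min_score=20):
    """Return framework names at or above min_score, best first."""
    rows = sorted(payload["results"], key=lambda r: r["score"], reverse=True)
    return [r["name"] for r in rows if r["score"] >= min_score]

print(top_frameworks(sample))  # ['lasgroup/SDPO', 'machinelearningnuremberg/DPL']
```

To work against the live API, replace `sample` with the parsed body of a GET to the URL above (e.g. via `urllib.request` or `requests`).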

| # | Framework | Description | Score | Tier |
|---|-----------|-------------|-------|------|
| 1 | lasgroup/SDPO | Reinforcement Learning via Self-Distillation (SDPO) | 46 | Emerging |
| 2 | machinelearningnuremberg/DPL | [NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power... | 28 | Experimental |
| 3 | HUST-AI-HYZ/FARMS | Open source code for ICML 2025 Paper: Eigenspectrum Analysis of Neural... | 18 | Experimental |