litgpt and superGPT
Scores (out of 25)

                litgpt    superGPT
Maintenance     13/25     13/25
Adoption        20/25     4/25
Maturity        25/25     9/25
Community       20/25     9/25
Stats

                litgpt       superGPT
Stars           13,225       7
Forks           1,409        1
Downloads       15,196       —
Commits (30d)   5            0
Language        Python       Python
License         Apache-2.0   MIT
Risk flags: none for litgpt. superGPT has no published package and no known dependents.
About litgpt
Lightning-AI/litgpt
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
Implements models from scratch without abstraction layers, and combines Flash Attention with FSDP for distributed training across 1-1000+ GPUs/TPUs. Supports parameter-efficient finetuning via LoRA/QLoRA and mixed-precision quantization (fp4/8/16/32) to reduce GPU memory requirements. Integrates with PyTorch Lightning and Lightning Cloud infrastructure for end-to-end pretraining, finetuning, and deployment workflows driven by declarative YAML recipes.
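LoRA, mentioned above, replaces a full weight update with the product of two low-rank matrices, so only a small fraction of parameters is trained. A minimal numpy sketch of the idea (dimensions and rank are illustrative; this is not litgpt's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r = 16, 16, 4          # illustrative dimensions; rank r << d
W = rng.normal(size=(d_out, d_in))  # frozen pretrained weight
A = rng.normal(size=(r, d_in))      # trainable low-rank factor
B = np.zeros((d_out, r))            # trainable; zero init makes the delta start at 0

def lora_forward(x: np.ndarray) -> np.ndarray:
    """y = (W + B @ A) x — only A and B are updated during finetuning."""
    return (W + B @ A) @ x

x = rng.normal(size=d_in)
# With B initialised to zero, the adapted layer matches the frozen layer.
assert np.allclose(lora_forward(x), W @ x)
# The adapter trains r*(d_in + d_out) parameters instead of d_in*d_out.
print(A.size + B.size, "vs", W.size)  # 128 vs 256
```

At these toy sizes the saving is 2x; at transformer scale (d in the thousands, r in the tens) it is orders of magnitude, which is why the paragraph above pairs LoRA with quantization to shrink GPU memory needs.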
About superGPT
viralcode/superGPT
Train your own LLM from scratch
Scores updated daily from GitHub, PyPI, and npm data.