llm-foundry and llm-from-scratch

One provides a comprehensive, production-ready framework for training large language models on Databricks; the other offers a hands-on, educational implementation for understanding LLM mechanics from first principles. They complement each other, serving different stages of learning and development.

| Metric        | llm-foundry   | llm-from-scratch          |
|---------------|---------------|---------------------------|
| Overall score | 71 (Verified) | 55 (Established)          |
| Maintenance   | 6/25          | 6/25                      |
| Adoption      | 18/25         | 9/25                      |
| Maturity      | 25/25         | 16/25                     |
| Community     | 22/25         | 24/25                     |
| Stars         | 4,397         | 97                        |
| Forks         | 584           | 108                       |
| Downloads     | 4,165         | n/a                       |
| Commits (30d) | 0             | 0                         |
| Language      | Python        | Jupyter Notebook          |
| License       | Apache-2.0    | Apache-2.0                |
| Risk flags    | None          | No package, no dependents |

About llm-foundry

mosaicml/llm-foundry

LLM training code for Databricks foundation models

Implements end-to-end training, finetuning, evaluation, and inference pipelines with built-in support for efficiency techniques like Flash Attention and Mixture-of-Experts architectures. Integrates with Composer for distributed training optimization and MosaicML's platform for scalable workload orchestration, while supporting both HuggingFace and proprietary models (MPT, DBRX) from 125M to 132B parameters. Includes data preparation utilities for StreamingDataset format, inference export to ONNX/HuggingFace, and in-context learning evaluation on academic benchmarks.
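Training runs in llm-foundry are driven by YAML configs passed to its training script via the Composer launcher. A minimal sketch of such a config follows; the field names are recalled from the project's example configs and are assumptions here, so check them against the version in use:

```yaml
# Hypothetical minimal pretraining config for llm-foundry's
# scripts/train/train.py. Keys follow the project's published examples
# but exact names and values vary by release.
max_seq_len: 2048
model:
  name: mpt_causal_lm            # small MPT-style decoder-only model
  d_model: 768
  n_heads: 12
  n_layers: 12
tokenizer:
  name: EleutherAI/gpt-neox-20b
train_loader:
  name: text
  dataset:
    remote: s3://my-bucket/streaming-data   # StreamingDataset shards (placeholder path)
    split: train
optimizer:
  name: decoupled_adamw
  lr: 6.0e-4
max_duration: 1000ba             # train for 1000 batches
global_train_batch_size: 256
```

A run would then be launched with something like `composer scripts/train/train.py my_config.yaml`, with Composer handling distributed training across devices.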

About llm-from-scratch

rickiepark/llm-from-scratch

Code repository for *Build and Study an LLM from Scratch* (Gilbut, 2025)
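Where llm-foundry hides the model internals behind configs, this repository's notebooks build them step by step. A minimal NumPy sketch of single-head causal self-attention, the core mechanism such from-scratch implementations construct (the notebooks use PyTorch; this stand-alone version is illustrative only):

```python
import numpy as np

def causal_self_attention(x, W_q, W_k, W_v):
    """Single-head causal self-attention over a sequence of token embeddings."""
    seq_len, _ = x.shape
    d_k = W_k.shape[1]
    Q, K, V = x @ W_q, x @ W_k, x @ W_v           # project tokens to queries/keys/values
    scores = Q @ K.T / np.sqrt(d_k)               # scaled dot-product scores
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(future, -np.inf, scores)    # causal mask: no attending to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                            # attention-weighted sum of values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                       # 4 tokens, embedding dim 8
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = causal_self_attention(x, W_q, W_k, W_v)
print(out.shape)  # (4, 8)
```

Because of the causal mask, the first token can attend only to itself, so its output row is exactly its own value vector; stacking several such heads plus feed-forward layers yields the GPT-style blocks the book assembles.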


Scores updated daily from GitHub, PyPI, and npm data.