llm-foundry and llm-from-scratch
One provides a comprehensive, production-ready framework for training large language models on Databricks; the other offers a hands-on, educational implementation for understanding LLM mechanics from first principles. They complement each other at different stages of learning and development.
About llm-foundry
mosaicml/llm-foundry
LLM training code for Databricks foundation models
llm-foundry implements end-to-end training, finetuning, evaluation, and inference pipelines, with built-in support for efficiency techniques such as Flash Attention and Mixture-of-Experts architectures. It integrates with Composer for distributed training optimization and with MosaicML's platform for scalable workload orchestration, and it supports both HuggingFace models and proprietary ones (MPT, DBRX) from 125M to 132B parameters. The repository also includes data preparation utilities for the StreamingDataset format, inference export to ONNX and HuggingFace formats, and in-context learning evaluation on academic benchmarks.
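To give a feel for how such a pipeline is driven, llm-foundry training runs are configured through YAML files. The fragment below is a hedged sketch only: the key names and values are assumptions modeled on the repo's example configs, and the exact schema may differ between releases (consult the repository's own `train/yamls` examples before use).

```yaml
# Hypothetical llm-foundry-style training config (a sketch, not copied
# from the repo; exact keys may differ between releases).
max_seq_len: 2048
global_seed: 17

model:
  name: mpt_causal_lm          # one of the built-in MPT variants
  d_model: 768
  n_heads: 12
  n_layers: 12
  attn_config:
    attn_impl: flash           # Flash Attention, as mentioned above

tokenizer:
  name: EleutherAI/gpt-neox-20b

train_loader:
  name: text
  dataset:
    remote: s3://my-bucket/mds/train/   # StreamingDataset (MDS) shards
    split: train
    shuffle: true

optimizer:
  name: decoupled_adamw
  lr: 6.0e-4

max_duration: 10ba             # a short smoke-test run
global_train_batch_size: 256
```

A config like this would be passed to the repo's training entry point, with Composer handling distributed execution underneath.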
About llm-from-scratch
rickiepark/llm-from-scratch
Code repository for the book *LLM from Scratch: Learn by Building from the Ground Up* (Gilbut, 2025)