happy-llm and llm-universe
Both are tutorials from Datawhale China: happy-llm builds a foundational understanding of LLM principles and practice, while llm-universe teaches beginners to develop applications on top of large models, making the two complementary resources in the LLM learning ecosystem.
About happy-llm
datawhalechina/happy-llm
📚 A from-scratch tutorial on the principles and practice of large language models
Covers foundational NLP concepts through practical LLM implementation, with structured chapters progressing from Transformer architecture and attention mechanisms to hands-on model building using PyTorch. Includes end-to-end training workflows (pretraining, supervised fine-tuning, LoRA optimization) and applications like RAG and agent systems, with downloadable pretrained 215M parameter models and companion code implementations.
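Among the fine-tuning topics listed above, LoRA is the one with the simplest core idea: rather than updating a full weight matrix W, train two small low-rank matrices B and A and use W + (alpha/r)·BA. The pure-Python sketch below illustrates that idea with toy matrices; the dimensions and values are illustrative only and are not taken from the tutorial's code.

```python
# Sketch of the LoRA idea: adapt a frozen weight matrix W (d_out x d_in)
# with two small trainable matrices B (d_out x r) and A (r x d_in),
# giving an effective weight W + (alpha / r) * B @ A.

def matmul(X, Y):
    """Plain-Python matrix multiply for small illustrative matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def lora_weight(W, A, B, alpha):
    """Return the effective weight W + (alpha / r) * B @ A."""
    r = len(A)              # LoRA rank = number of rows of A
    scale = alpha / r
    BA = matmul(B, A)
    return [[W[i][j] + scale * BA[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Toy 4x4 identity weight with rank-1 adapters.
W = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
A = [[1.0, 0.0, 0.0, 0.0]]           # r x d_in  = 1 x 4
B = [[0.0], [2.0], [0.0], [0.0]]     # d_out x r = 4 x 1
W_eff = lora_weight(W, A, B, alpha=1.0)

# Trainable parameters drop from d_out * d_in to r * (d_in + d_out):
full_params = 4 * 4          # 16 if W itself were trained
lora_params = 1 * (4 + 4)    # 8 for the rank-1 adapters
```

The parameter count at the end shows why LoRA is cheap: only r·(d_in + d_out) values are trained instead of d_out·d_in, a saving that grows with matrix size.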
About llm-universe
datawhalechina/llm-universe
This project is a large-model application development tutorial for beginner developers; read it online at https://datawhalechina.github.io/llm-universe/
Covers unified API wrappers for major domestic and international LLM providers (GPT, Baidu Wenxin, iFlytek Spark, Zhipu GLM) alongside LangChain integration, enabling consistent multi-model invocation without API-specific implementation details. Teaches RAG architecture through a practical personal knowledge base assistant project, combining document loading/chunking, vector database construction with embedding APIs, and Streamlit deployment—all executable on standard hardware without GPU requirements. Structured in three progressive tracks: foundational LLM application development, advanced RAG optimization techniques (hybrid retrieval, prompt engineering, fine-tuning), and open-source project case studies.
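The chunk → embed → retrieve loop that the knowledge-base assistant project is built around can be sketched in a few lines. In the tutorial, embeddings come from an embedding API and chunks live in a vector database; here a bag-of-words vector and cosine similarity stand in for both, so the sketch runs with no GPU, network access, or API keys. The sample documents and chunk sizes are invented for illustration.

```python
# Minimal stand-in for a RAG retrieval pipeline: split documents into
# chunks, "embed" each chunk, and return the chunks closest to a query.
import math
from collections import Counter

def chunk(text, size=80):
    """Split a document into fixed-size character chunks (a simple
    stand-in for the tutorial's document loading/chunking step)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text):
    """Bag-of-words 'embedding': a word -> count mapping. A real
    pipeline would call an embedding API here."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "LoRA fine-tunes large models with low-rank adapter matrices.",
    "Streamlit turns Python scripts into shareable web apps.",
]
# "Vector database": a list of (chunk, vector) pairs.
index = [(c, embed(c)) for d in docs for c in chunk(d)]

def retrieve(query, k=1):
    """Return the k chunks most similar to the query; the full pipeline
    would then pass them to the LLM as context for answering."""
    q = embed(query)
    scored = sorted(index, key=lambda cv: cosine(q, cv[1]), reverse=True)
    return [c for c, _ in scored[:k]]
```

Swapping `embed` for an API-backed embedding call and `index` for a real vector store turns this toy loop into the architecture the tutorial teaches; the control flow stays the same.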