MDalamin5/Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP
A complete end-to-end learning repo covering everything from building Large Language Models (LLMs) from scratch to mastering practical deep learning with PyTorch. Includes tokenizer coding, transformers, attention, training loops, model finetuning, and hands-on PyTorch projects.
Stars: —
Forks: 1
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Mar 15, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/MDalamin5/Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
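The endpoint above follows an `owner/repo` path pattern. A minimal sketch of calling it from Python, assuming only the URL scheme shown in the curl example (the response schema and any API-key header are not documented here, so the fetch itself is left commented out):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repo endpoint from the pattern shown in the curl example.
    return f"{BASE}/{owner}/{repo}"

url = quality_url(
    "MDalamin5",
    "Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP",
)

# Uncomment to fetch anonymously (100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```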
Higher-rated alternatives
Lightning-AI/litgpt
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
liangyuwang/Tiny-DeepSpeed
Tiny-DeepSpeed, a minimalistic re-implementation of the DeepSpeed library
catherinesyeh/attention-viz
Visualizing query-key interactions in language + vision transformers (VIS 2023)
microsoft/Text2Grad
🚀 Text2Grad: Converting natural language feedback into gradient signals for precise model...
huangjia2019/llm-gpt
From classic NLP to modern LLMs: building language models step by step. 异步图书: 《GPT图解 大模型是怎样构建的》 ("GPT Illustrated: How Large Models Are Built") -...