decodingai-magazine/llm-twin-course
🤖 Learn for free how to build an end-to-end production-ready LLM & RAG system using LLMOps best practices: source code + 12 hands-on lessons
Covers the complete MLOps pipeline across four microservices: data collection via web crawlers and CDC patterns into MongoDB, real-time feature engineering with Bytewax streaming embeddings into Qdrant, fine-tuning with LoRA/QLoRA tracked in Comet ML, and inference deployment on AWS SageMaker with RAG enhancement and prompt monitoring via Opik. Integrates with the Hugging Face model registry, Redis and Qdrant vector databases, RabbitMQ for event streaming, and AWS Lambda for serverless data collection.
4,297 stars. No commits in the last 6 months.
Stars: 4,297
Forks: 717
Language: Python
License: MIT
Category: —
Last pushed: Apr 26, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/decodingai-magazine/llm-twin-course"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.