decodingai-magazine/llm-twin-course

🤖 Learn for free how to build an end-to-end production-ready LLM & RAG system using LLMOps best practices: source code + 12 hands-on lessons

Overall score: 51 / 100 (Established)

Covers the complete MLOps pipeline across four microservices: data collection via web crawlers and CDC patterns into MongoDB, real-time feature engineering with Bytewax streaming into Qdrant vectors, fine-tuning with LoRA/QLoRA tracked in Comet ML, and inference deployment on AWS SageMaker with RAG enhancement and prompt monitoring via Opik. Integrates with Hugging Face model registry, Redis/Qdrant vector databases, RabbitMQ for event streaming, and AWS Lambda for serverless data collection.

4,297 stars. No commits in the last 6 months.

Flags: Stale (6m) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 23 / 25


Stars: 4,297
Forks: 717
Language: Python
License: MIT
Last pushed: Apr 26, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/decodingai-magazine/llm-twin-course"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.