reshalfahsi/gpt2moe-instruct
Instruction Fine-tuning of the GPT2MoE Model: GPT-2 with Mixture-of-Experts
No commits in the last 6 months.
Stars: 1
Forks: —
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Apr 28, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/reshalfahsi/gpt2moe-instruct"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
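For programmatic access, the same endpoint can also be queried from Python instead of curl. The sketch below is a minimal example that only assumes the endpoint returns JSON; the response field names are not documented here, so it simply prints whatever comes back.

```python
# Minimal sketch: fetch the repo data from the quality API shown above.
# Assumes the endpoint returns JSON; no particular response schema is assumed.
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/reshalfahsi/gpt2moe-instruct"

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# Pretty-print the raw response for inspection.
print(json.dumps(data, indent=2))
```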
Higher-rated alternatives
InternLM/xtuner
A Next-Generation Training Engine Built for Ultra-Large MoE Models
AmanPriyanshu/GPT-OSS-MoE-ExpertFingerprinting
ExpertFingerprinting: Behavioral Pattern Analysis and Specialization Mapping of Experts in...
arm-education/Advanced-AI-Mixture-of-Experts
Hands-on course materials for ML engineers to implement and optimize Mixture of Experts models:...
SuperBruceJia/Awesome-Mixture-of-Experts
Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of...
sumitdotml/moe-emergence
a project highlighting the emergent expert specialization in Mixture of Experts (MoEs) across 3...