Mixture-of-Experts LLM Tools

Techniques, implementations, and optimizations for Mixture-of-Experts (MoE) architectures in language models, including compression, routing, expert specialization, and training methods. Does NOT include general model compression, non-MoE attention mechanisms, or domain-specific applications using MoE as a black box.

There are 12 Mixture-of-Experts LLM tools tracked. One scores above 70 (Verified tier). The highest-rated is InternLM/xtuner at 86/100, with 5,096 stars and 1,643 monthly downloads. One of the top 10 is actively maintained.

Get all 12 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=llm-tools&subcategory=mixture-of-experts-llms&limit=20"
```

Open to everyone: 100 requests/day with no key needed; a free API key raises the limit to 1,000 requests/day.
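For scripted access, the same query can be built in Python. This is a minimal sketch: the endpoint and parameter names (`domain`, `subcategory`, `limit`) are taken from the curl example above, but the shape of the JSON response is an assumption and should be checked against a live call.

```python
from urllib.parse import urlencode

# Endpoint from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_query_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Build the dataset query URL using the parameters shown in the curl example."""
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = build_query_url("llm-tools", "mixture-of-experts-llms")
# To fetch, pass `url` to e.g. urllib.request.urlopen or requests.get and
# parse the body as JSON; the response schema is not documented here.
```

Keeping URL construction separate from the HTTP call makes the query easy to reuse across domains and subcategories without re-reading the docs.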

| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | InternLM/xtuner | A Next-Generation Training Engine Built for Ultra-Large MoE Models | 86 | Verified |
| 2 | SuperBruceJia/Awesome-Mixture-of-Experts | Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE)... | 38 | Emerging |
| 3 | AmanPriyanshu/GPT-OSS-MoE-ExpertFingerprinting | ExpertFingerprinting: Behavioral Pattern Analysis and Specialization Mapping... | 35 | Emerging |
| 4 | arm-education/Advanced-AI-Mixture-of-Experts | Hands-on course materials for ML engineers to implement and optimize Mixture... | 32 | Emerging |
| 5 | rioyokotalab/optimal-sparsity | [ICLR 2026 Oral] Optimal Sparsity of Mixture-of-Experts Language Models for... | 23 | Experimental |
| 6 | sumitdotml/moe-emergence | A project highlighting the emergent expert specialization in Mixture of... | 22 | Experimental |
| 7 | robinzixuan/FROST | [ICLR 2026] FROST: Filtering Reasoning Outliers with Attention for Efficient... | 22 | Experimental |
| 8 | iahuang/cosmoe | Enabling inference of large mixture-of-experts (MoE) models on Apple Silicon... | 22 | Experimental |
| 9 | nusnlp/moece | The official code of the "Efficient and Interpretable Grammatical Error... | 15 | Experimental |
| 10 | lorenzflow/robust-moa | This is the official repository for the paper: This is your Doge: Exploring... | 13 | Experimental |
| 11 | Devanik21/HAG-MoE | HAG-MoE introduces a revolutionary approach to artificial intelligence by... | 12 | Experimental |
| 12 | braingpt-lovelab/matching_experts | Source code for | 12 | Experimental |