capybara-brain346/moe-router

A small Mixture-of-Experts (MoE) Transformer trained from scratch to learn how sparse expert models work and to study their performance, routing behavior, and efficiency under realistic conditions.
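For readers new to sparse expert models, here is a minimal top-k gating sketch in PyTorch. It is a generic illustration of the routing idea, not code from this repository; the class name, layer sizes, and the choice of k = 2 are assumptions.

# Generic top-k expert router sketch (illustration only, not from moe-router).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKRouter(nn.Module):
    def __init__(self, d_model: int, num_experts: int, k: int = 2):
        super().__init__()
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        self.k = k

    def forward(self, x: torch.Tensor):
        # x: (tokens, d_model) -> routing decisions per token
        logits = self.gate(x)                        # (tokens, num_experts)
        probs = F.softmax(logits, dim=-1)
        topk_probs, topk_idx = probs.topk(self.k, dim=-1)
        # Renormalize so the k selected experts' weights sum to 1 per token
        topk_probs = topk_probs / topk_probs.sum(dim=-1, keepdim=True)
        return topk_idx, topk_probs                  # which experts, with what weight

router = TopKRouter(d_model=64, num_experts=8, k=2)
idx, w = router(torch.randn(16, 64))
print(idx.shape, w.shape)  # torch.Size([16, 2]) torch.Size([16, 2])

Each token is sent only to its k selected experts, which is what makes the model "sparse" at inference and training time.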

Score: 17 / 100 (Experimental)
No Package · No Dependents

Maintenance: 6 / 25
Adoption: 2 / 25
Maturity: 9 / 25
Community: 0 / 25

The overall score appears to be the sum of the four 25-point component scores: 6 + 2 + 9 + 0 = 17.

Stars: 2
Forks:
Language: Python
License: MIT
Last pushed: Nov 17, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/capybara-brain346/moe-router"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
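A minimal Python sketch of the same request using only the standard library; it assumes the endpoint returns JSON and simply prints whatever fields come back.

# Fetch the quality data for this repo and pretty-print the JSON response.
# The response structure is not documented here, so no specific fields are assumed.
import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/capybara-brain346/moe-router")

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))  # inspect the returned fields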