zhongshsh/MoExtend
ACL 2024 (SRW). Official codebase of the paper "MoExtend: Tuning New Experts for Modality and Task Extension".
No commits in the last 6 months.
Stars: 14
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Dec 03, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/zhongshsh/MoExtend"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
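For scripted access, here is a minimal Python sketch using the requests library; it queries the same endpoint as the curl example above. The response schema is not documented on this page, so the snippet simply fetches the JSON and prints it rather than assuming any particular field names.

import requests

# Endpoint taken from the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/zhongshsh/MoExtend"

def fetch_quality_data(url: str = URL) -> dict:
    # Fetch the quality record for this repo. The shape of the JSON is not
    # documented on this page, so callers should inspect the returned dict.
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()  # raise if the request failed (e.g. rate limit exceeded)
    return resp.json()

if __name__ == "__main__":
    print(fetch_quality_data())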
Higher-rated alternatives
EfficientMoE/MoE-Infinity: PyTorch library for cost-effective, fast and easy serving of MoE models.
jaisidhsingh/pytorch-mixtures: One-stop solutions for Mixture of Expert modules in PyTorch.
raymin0223/mixture_of_recursions: Mixture-of-Recursions: Learning Dynamic Recursive Depths for Adaptive Token-Level Computation...
thu-nics/MoA: [CoLM'25] The official implementation of the paper
AviSoori1x/makeMoE: From scratch implementation of a sparse mixture of experts language model inspired by Andrej...