kyegomez/LIMoE
Implementation of "the first large-scale multimodal mixture of experts models," from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts".
Available on PyPI.
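LIMoE routes each token (image patches and text tokens alike) to expert MLPs chosen by a learned router. For orientation only, here is a minimal plain-PyTorch sketch of top-1 mixture-of-experts routing; the class and parameter names are illustrative assumptions, not this package's API (see the repo README for actual usage).

# Illustrative sketch of sparse top-1 mixture-of-experts routing; not the LIMoE package API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top1MoE(nn.Module):
    def __init__(self, dim: int, num_experts: int, hidden: int):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # routing logits per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim); each token goes to its single best-scoring expert.
        gates = F.softmax(self.router(x), dim=-1)   # (batch, tokens, num_experts)
        weight, index = gates.max(dim=-1)           # top-1 gate value and expert id per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = index == e                       # tokens routed to expert e
            if mask.any():
                out[mask] = weight[mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = Top1MoE(dim=512, num_experts=8, hidden=2048)
tokens = torch.randn(2, 16, 512)                    # e.g. image or text tokens
print(moe(tokens).shape)                            # torch.Size([2, 16, 512])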
Stars: 36
Forks: 2
Language: Python
License: MIT
Category: transformers
Last pushed: Jan 31, 2026
Monthly downloads: 24
Commits (30d): 0
Dependencies: 5
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/LIMoE"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
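To call the same endpoint from Python, a minimal sketch using the requests library (assuming the endpoint returns JSON; the response schema is not documented here):

# Same request as the curl command above.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/LIMoE"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors, e.g. after the 100 requests/day limit
print(resp.json())       # schema undocumented here, so just print the raw payload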
Related models
dohlee/chromoformer
The official code implementation for Chromoformer in PyTorch. (Lee et al., Nature Communications, 2022)
ahans30/goldfish-loss
[NeurIPS 2024] Goldfish Loss: Mitigating Memorization in Generative LLMs
yinboc/trans-inr
Transformers as Meta-Learners for Implicit Neural Representations, in ECCV 2022
bloomberg/MixCE-acl2023
Implementation of MixCE method described in ACL 2023 paper by Zhang et al.