bwconrad/soft-moe
PyTorch implementation of "From Sparse to Soft Mixtures of Experts"
No commits in the last 6 months. Available on PyPI.
Stars: 68
Forks: 3
Language: Python
License: Apache-2.0
Category:
Last pushed: Aug 22, 2023
Monthly downloads: 26
Commits (30d): 0
Dependencies: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/bwconrad/soft-moe"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
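The endpoint above can also be queried programmatically. A minimal Python sketch of parsing such a response is below; the JSON field names (`repo`, `stars`, `monthly_downloads`) are hypothetical examples based on the stats shown on this page, not a documented response schema:

```python
import json

# Hypothetical sample payload; the real API's field names may differ.
sample = json.loads("""
{
  "repo": "bwconrad/soft-moe",
  "stars": 68,
  "forks": 3,
  "monthly_downloads": 26
}
""")

def summarize(payload: dict) -> str:
    """Build a one-line summary from a metadata payload like the one above."""
    return (f"{payload['repo']}: {payload['stars']} stars, "
            f"{payload['monthly_downloads']} downloads/month")

print(summarize(sample))
```

To fetch live data instead of the sample, pipe the `curl` output above into a script reading from stdin, staying within the 100 requests/day unauthenticated limit.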
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi Supervised Classification
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
microsoft/Semi-supervised-learning
A Unified Semi-Supervised Learning Codebase (NeurIPS'22)
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, simsiam, SwAV, BEiT, and MAE, as well as Vision...