Mixup Augmentation Frameworks: Transformer Models

There are 5 mixup augmentation framework projects tracked for transformer models; 1 scores above 50 (established tier). The highest-rated is kyegomez/LIMoE at 51/100, with 36 stars and 24 monthly downloads.

Get all 5 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=mixup-augmentation-frameworks&limit=20"
```

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
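The endpoint above can be consumed directly from Python. The sketch below builds the same query URL with the standard library and filters a response by score threshold; the response field names (`model`, `score`, `tier`) are assumptions based on the table below, not a documented schema, so verify them against a real response before relying on them.

```python
import json
from urllib.parse import urlencode

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    # Reproduce the query string used in the curl example above.
    query = urlencode({"domain": domain, "subcategory": subcategory, "limit": limit})
    return f"{BASE}?{query}"

def established(projects: list[dict], threshold: int = 50) -> list[str]:
    # Keep only projects scoring above the 'established' cutoff (50).
    return [p["model"] for p in projects if p["score"] > threshold]

# Hypothetical payload mirroring the table below; the real schema may differ.
sample = json.loads('[{"model": "kyegomez/LIMoE", "score": 51, "tier": "Established"},'
                    ' {"model": "dohlee/chromoformer", "score": 42, "tier": "Emerging"}]')

print(build_url("transformers", "mixup-augmentation-frameworks"))
print(established(sample))  # → ['kyegomez/LIMoE']
```

Fetching the live data is then a matter of passing `build_url(...)` to any HTTP client (e.g. `urllib.request.urlopen`) and decoding the body with `json.loads`.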

| # | Model | Description | Score | Tier |
|---|-------|-------------|-------|------|
| 1 | kyegomez/LIMoE | Implementation of "the first large-scale multimodal mixture of experts... | 51 | Established |
| 2 | dohlee/chromoformer | The official code implementation for Chromoformer in PyTorch (Lee et al.,... | 42 | Emerging |
| 3 | ahans30/goldfish-loss | [NeurIPS 2024] Goldfish Loss: Mitigating Memorization in Generative LLMs | 36 | Emerging |
| 4 | yinboc/trans-inr | Transformers as Meta-Learners for Implicit Neural Representations, in ECCV 2022 | 36 | Emerging |
| 5 | bloomberg/MixCE-acl2023 | Implementation of the MixCE method described in the ACL 2023 paper by Zhang et al. | 27 | Experimental |