yelabb/PhantomX
On the Limits of Discrete Representations for Neural Control: a systematic empirical study of tokenization, quantization, and inductive bias in BCI (a.k.a. documented failures)
Stars: 2
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Jan 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yelabb/PhantomX"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
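The same endpoint can be called programmatically. A minimal Python sketch, assuming the endpoint returns JSON (the response schema is not documented here, so the parsed result is treated as an opaque dict):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as JSON (assumed content type).

    Raises urllib.error.URLError on network failure, and counts
    against the 100 requests/day keyless rate limit.
    """
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("yelabb", "PhantomX"))
# https://pt-edge.onrender.com/api/v1/quality/transformers/yelabb/PhantomX
```

With a free API key, the higher 1,000/day limit would presumably require an auth header; the exact header name is not stated above, so it is omitted here.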
Higher-rated alternatives
- VectorInstitute/odyssey: A toolkit for developing foundation models using Electronic Health Record (EHR) data.
- ycq091044/BIOT: A framework for pretraining biosignals at scale; large EEG pre-trained models.
- AntixK/PyTorch-Model-Compare: Compare neural networks by their feature similarity.
- soda-inria/carte: Repository for CARTE: Context-Aware Representation of Table Entries.
- harryjdavies/HeartGPT: Interpretable pre-trained transformers for heart time-series data.