ycq091044/BIOT
BIOT - A framework for pretraining biosignal models at scale, including large pretrained EEG models.
Normalizes heterogeneous EEG data—varying channel counts, lengths, and missing values—into consistent token sequences processed by a transformer encoder, enabling cross-dataset pretraining. Provides multiple pretrained checkpoints (5M+ samples from resting EEG, sleep EEG, and seizure datasets) that transfer effectively to downstream tasks like abnormality detection. Includes baseline implementations of SPaRCNet, ContraWR, and ST-Transformer alongside supervised and unsupervised pretraining pipelines.
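The normalization idea above can be sketched in PyTorch. This is an illustrative toy, not the repo's actual API: each channel is split into fixed-length windows, every window becomes one token, and channel plus position embeddings are added so recordings with different channel counts and durations map to a single flat token sequence. All class and parameter names here are invented for the example.

```python
import torch
import torch.nn as nn

class BiosignalTokenizer(nn.Module):
    """Toy BIOT-style tokenizer (hypothetical, not the repo's code):
    per-channel fixed-length windows -> tokens with channel/position
    embeddings -> one flat sequence a transformer can consume."""

    def __init__(self, window: int = 200, d_model: int = 64,
                 max_channels: int = 32, max_windows: int = 64):
        super().__init__()
        self.window = window
        self.proj = nn.Linear(window, d_model)        # per-window embedding
        self.chan_emb = nn.Embedding(max_channels, d_model)
        self.pos_emb = nn.Embedding(max_windows, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (channels, samples); samples need not divide evenly by `window`
        n_ch, n_samp = x.shape
        n_win = n_samp // self.window
        x = x[:, : n_win * self.window].reshape(n_ch, n_win, self.window)
        tok = self.proj(x)                            # (n_ch, n_win, d_model)
        tok = tok + self.chan_emb(torch.arange(n_ch)).unsqueeze(1)
        tok = tok + self.pos_emb(torch.arange(n_win)).unsqueeze(0)
        return tok.reshape(n_ch * n_win, -1)          # flat token sequence

tokenizer = BiosignalTokenizer()
# Two recordings with different channel counts and lengths both become
# flat (n_tokens, d_model) sequences of the same width.
a = tokenizer(torch.randn(16, 2000))  # 16 channels, 10 s at 200 Hz
b = tokenizer(torch.randn(4, 1000))   # 4 channels, 5 s at 200 Hz
```

Because missing or extra channels only change the number of tokens, not the token format, the same encoder can be pretrained across datasets with incompatible montages.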
182 stars. No commits in the last 6 months.
Stars: 182
Forks: 36
Language: Python
License: MIT
Last pushed: Dec 11, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ycq091044/BIOT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
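For scripted access, the curl call above translates to a short Python sketch. The endpoint URL comes from the snippet above; the response schema is not documented here, so treat the returned JSON fields as unknown.

```python
import requests

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the endpoint URL for a given GitHub owner/repo pair.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # No API key needed for up to 100 requests/day (per the note above);
    # the shape of the returned JSON is an assumption left unparsed here.
    resp = requests.get(quality_url(owner, repo), timeout=10)
    resp.raise_for_status()
    return resp.json()

url = quality_url("ycq091044", "BIOT")
```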
Higher-rated alternatives
VectorInstitute/odyssey
A toolkit for developing foundation models using Electronic Health Record (EHR) data.
AntixK/PyTorch-Model-Compare
Compare neural networks by their feature similarity
soda-inria/carte
Repository for CARTE: Context-Aware Representation of Table Entries
harryjdavies/HeartGPT
Interpretable Pre-Trained Transformers for Heart Time-Series Data
woodRock/fishy-business
Machine Learning for Rapid Evaporative Ionization Mass Spectrometry for Marine Biomass Analysis...