ycq091044/BIOT

BIOT - A framework for pretraining on biosignals at scale. Large pretrained EEG models.

Score: 47 / 100 (Emerging)

Normalizes heterogeneous EEG data—varying channel counts, lengths, and missing values—into consistent token sequences processed by a transformer encoder, enabling cross-dataset pretraining. Provides multiple pretrained checkpoints (5M+ samples from resting EEG, sleep EEG, and seizure datasets) that transfer effectively to downstream tasks like abnormality detection. Includes baseline implementations of SPaRCNet, ContraWR, and ST-Transformer alongside supervised and unsupervised pretraining pipelines.
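The tokenization idea can be sketched roughly as follows: each channel is cut into fixed-length segments, each segment becomes one token, and channel plus position embeddings tell the encoder where every token came from, so recordings with different channel counts or durations all map into the same token space. This is a minimal, hypothetical illustration (class and parameter names such as ToyBiosignalTokenizer and segment_len are not from the repository), not the actual BIOT code, which differs in details such as its spectral features.

# Hedged sketch: how variable-channel, variable-length EEG might be turned into
# a flat token sequence for a transformer encoder. Names and hyperparameters
# are illustrative, not taken from the BIOT source.
import torch
import torch.nn as nn


class ToyBiosignalTokenizer(nn.Module):
    def __init__(self, segment_len=200, d_model=256, max_channels=32, max_segments=64):
        super().__init__()
        self.segment_len = segment_len
        # Project each raw segment (one channel, segment_len samples) to a token.
        self.segment_proj = nn.Linear(segment_len, d_model)
        # Embeddings tell the encoder which channel / time slot a token came from.
        self.channel_emb = nn.Embedding(max_channels, d_model)
        self.position_emb = nn.Embedding(max_segments, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=4,
        )

    def forward(self, eeg):
        # eeg: (channels, samples); channel count and length may differ per recording.
        channels, samples = eeg.shape
        n_seg = samples // self.segment_len
        # Drop the tail that does not fill a whole segment, then split per channel.
        segments = eeg[:, : n_seg * self.segment_len].reshape(channels, n_seg, self.segment_len)
        tokens = self.segment_proj(segments)  # (channels, n_seg, d_model)
        tokens = tokens + self.channel_emb(torch.arange(channels)).unsqueeze(1)
        tokens = tokens + self.position_emb(torch.arange(n_seg)).unsqueeze(0)
        # Flatten the (channel, segment) grid into one sequence for the transformer.
        tokens = tokens.reshape(1, channels * n_seg, -1)
        return self.encoder(tokens)


# Two recordings with different channel counts and lengths map to the same token space.
model = ToyBiosignalTokenizer()
print(model(torch.randn(16, 2000)).shape)  # torch.Size([1, 160, 256])
print(model(torch.randn(18, 1200)).shape)  # torch.Size([1, 108, 256])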

182 stars. No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 21 / 25


Stars: 182
Forks: 36
Language: Python
License: MIT
Last pushed: Dec 11, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ycq091044/BIOT"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
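The same data can also be fetched programmatically. Below is a minimal Python sketch assuming the endpoint returns JSON; the response schema is not documented here, so the body is simply printed.

# Minimal sketch of calling the quality endpoint from Python (assumes a JSON response).
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/ycq091044/BIOT"
resp = requests.get(url, timeout=30)
resp.raise_for_status()
print(resp.json())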