qingsongedu/time-series-transformers-review

A professionally curated list of awesome resources (paper, code, data, etc.) on transformers in time series.

Quality score: 46 / 100 (Emerging)

Organizes transformer architectures for time series into a taxonomy covering forecasting, classification, anomaly detection, and imputation tasks, with links to 100+ peer-reviewed papers from top venues (ICLR, NeurIPS, IJCAI). Highlights key innovations like decomposition-based attention (Autoformer), patching strategies (PatchTST), and frequency-domain enhancements (FEDformer) that address challenges in long-range dependencies and non-stationary patterns. Includes curated implementations and benchmark datasets to support reproducible research across forecasting horizons and multivariate time series scenarios.

2,968 stars. No commits in the last 6 months.

Flags: Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 2,968
Forks: 271
Language:
License: MIT
Last pushed: Aug 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/qingsongedu/time-series-transformers-review"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
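For programmatic use, the same endpoint can be called from Python. A minimal sketch, using only the standard library; the URL pattern comes from the curl line above, but the JSON response schema is an assumption and should be inspected before relying on specific field names.

```python
# Minimal sketch: fetch a repository's quality card as JSON via the public API.
# The URL pattern (topic/owner/repo) is taken from the curl example above;
# the shape of the returned JSON is NOT documented here and is assumed.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(topic: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{topic}/{owner}/{repo}"


def fetch_quality(topic: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (schema assumed)."""
    with urllib.request.urlopen(quality_url(topic, owner, repo)) as resp:
        return json.load(resp)


# Example usage (performs a live network request):
#   data = fetch_quality("transformers", "qingsongedu",
#                        "time-series-transformers-review")
#   print(json.dumps(data, indent=2))
```

Keeping URL construction separate from the request makes the endpoint easy to test and to swap for other repositories in the same topic.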