qingsongedu/time-series-transformers-review
A professionally curated list of awesome resources (papers, code, data, etc.) on transformers in time series.
Organizes transformer architectures for time series into a taxonomy covering forecasting, classification, anomaly detection, and imputation tasks, with links to 100+ peer-reviewed papers from top venues (ICLR, NeurIPS, IJCAI). Highlights key innovations like decomposition-based attention (Autoformer), patching strategies (PatchTST), and frequency-domain enhancements (FEDformer) that address challenges in long-range dependencies and non-stationary patterns. Includes curated implementations and benchmark datasets to support reproducible research across forecasting horizons and multivariate time series scenarios.
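To make the patching idea concrete: PatchTST-style models split a long series into short overlapping patches and feed each patch to the transformer as one token, shortening the attention sequence. A minimal sketch, assuming a univariate series; `patch_len` and `stride` are illustrative values, not the repository's API:

```python
import numpy as np

def patchify(series, patch_len=16, stride=8):
    """Split a 1-D series into overlapping patches (PatchTST-style tokens)."""
    n = len(series)
    starts = range(0, n - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

x = np.arange(64, dtype=float)          # toy series of length 64
patches = patchify(x)                   # shape (7, 16): 7 patch tokens
```

Each row of `patches` becomes one input token, so attention cost scales with the number of patches rather than the number of raw time steps.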
2,968 stars. No commits in the last 6 months.
Stars: 2,968
Forks: 271
Language: —
License: MIT
Category: —
Last pushed: Aug 08, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/qingsongedu/time-series-transformers-review"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
amazon-science/chronos-forecasting
Chronos: Pretrained Models for Time Series Forecasting
SalesforceAIResearch/uni2ts
Unified Training of Universal Time Series Forecasting Transformers
ServiceNow/TACTiS
TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series, from...
moment-timeseries-foundation-model/moment
MOMENT: A Family of Open Time-series Foundation Models, ICML'24
yotambraun/APDTFlow
APDTFlow is a modern and extensible forecasting framework for time series data that leverages...