Abstract

Extending the forecasting time is a critical demand for real applications, such as extreme weather early warning and long-term energy consumption planning. This paper studies the long-term forecasting problem of time series. Prior Transformer-based models adopt various self-attention mechanisms to discover long-range dependencies. However, the intricate temporal patterns of the long-term future prevent the model from finding reliable dependencies. Moreover, Transformers have to adopt sparse versions of point-wise self-attention for efficiency on long series, resulting in an information-utilization bottleneck. Going beyond Transformers, we design Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism. We break with the pre-processing convention of series decomposition and renovate it as a basic inner block of deep models. This design empowers Autoformer with progressive decomposition capacities for complex time series. Further, inspired by stochastic process theory, we design the Auto-Correlation mechanism based on series periodicity, which conducts dependency discovery and representation aggregation at the sub-series level. Auto-Correlation outperforms self-attention in both efficiency and accuracy. In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks covering five practical applications: energy, traffic, economics, weather, and disease. Code is available at https://github.com/thuml/Autoformer.
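The two ideas the abstract names, decomposition as an inner block and period-based Auto-Correlation, can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration, not the authors' PyTorch implementation: `series_decomp` splits a series into seasonal and trend parts with a moving average (the kernel size of 25 and edge-replication padding are illustrative assumptions), and `autocorrelation` estimates lag correlations via FFT (the Wiener-Khinchin theorem), the statistic underlying period-based dependency discovery.

```python
import numpy as np

def series_decomp(x, kernel_size=25):
    # Moving-average decomposition into seasonal + trend parts; a sketch of
    # the inner decomposition block the abstract describes. kernel_size and
    # edge-replication padding are illustrative assumptions.
    pad = (kernel_size - 1) // 2
    padded = np.concatenate([np.repeat(x[0], pad), x,
                             np.repeat(x[-1], kernel_size - 1 - pad)])
    trend = np.convolve(padded, np.ones(kernel_size) / kernel_size, mode="valid")
    return x - trend, trend  # (seasonal, trend)

def autocorrelation(x):
    # Autocorrelation over all lags via FFT (Wiener-Khinchin theorem); the
    # periodicity statistic behind Auto-Correlation's dependency discovery.
    x = x - x.mean()
    n = len(x)
    spec = np.fft.rfft(x, n=2 * n)               # zero-pad to avoid circular wrap
    acf = np.fft.irfft(spec * np.conj(spec))[:n]
    return acf / acf[0]                          # normalize so lag 0 == 1

# Toy usage: a trended series repeating a random length-24 pattern.
rng = np.random.default_rng(0)
t = np.arange(480)
x = np.tile(rng.standard_normal(24), 20) + 0.05 * t + 0.1 * rng.standard_normal(480)
seasonal, trend = series_decomp(x)
acf = autocorrelation(seasonal)
print("dominant lag:", int(np.argmax(acf[1:])) + 1)  # expected: 24
```

Detrending before measuring autocorrelation matters here: a strong trend would dominate the lag correlations and mask the period, which is one motivation for making decomposition a progressive, built-in step rather than a one-off pre-processing pass.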

Keywords

Computer science, Bottleneck, Transformer, Data mining, Artificial intelligence, Industrial engineering, Machine learning, Engineering

Publication Info

Year: 2021
Type: Preprint
Citations: 1304 (OpenAlex)
Access: Closed

Cite This

Haixu Wu, Jiehui Xu, Jianmin Wang et al. (2021). Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. arXiv (Cornell University). https://doi.org/10.48550/arxiv.2106.13008

Identifiers

DOI: 10.48550/arxiv.2106.13008