Authors
Bryan Lim, Sercan Ö. Arık, Nicolas Loeff, Tomas Pfister
Publication date
2021/10/1
Journal
International Journal of Forecasting
Volume
37
Issue
4
Pages
1748–1764
Publisher
Elsevier
Description
Multi-horizon forecasting often contains a complex mix of inputs – including static (i.e. time-invariant) covariates, known future inputs, and other exogenous time series that are only observed in the past – without any prior information on how they interact with the target. Several deep learning methods have been proposed, but they are typically ‘black-box’ models that do not shed light on how they use the full range of inputs present in practical scenarios. In this paper, we introduce the Temporal Fusion Transformer (TFT) – a novel attention-based architecture that combines high-performance multi-horizon forecasting with interpretable insights into temporal dynamics. To learn temporal relationships at different scales, TFT uses recurrent layers for local processing and interpretable self-attention layers for long-term dependencies. TFT utilizes specialized components to select relevant features and a series of gating …
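The gating layers mentioned above are central to how TFT suppresses unneeded components. As a rough, non-authoritative sketch of that idea, the PyTorch snippet below implements a GLU-style gated residual block loosely in the spirit of the paper's Gated Residual Network; the class name GatedResidualNetwork and parameters such as hidden_size are illustrative assumptions, not the authors' reference implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GatedResidualNetwork(nn.Module):
        # Minimal sketch of a gated residual block, loosely following the
        # GRN described in the TFT paper. All names here are illustrative.
        def __init__(self, input_size: int, hidden_size: int):
            super().__init__()
            self.fc1 = nn.Linear(input_size, hidden_size)
            self.fc2 = nn.Linear(hidden_size, hidden_size)
            # GLU gate: when the gate saturates near zero, the block's
            # contribution is suppressed and the skip path dominates.
            self.gate = nn.Linear(hidden_size, 2 * hidden_size)
            self.skip = (nn.Linear(input_size, hidden_size)
                         if input_size != hidden_size else nn.Identity())
            self.norm = nn.LayerNorm(hidden_size)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.fc2(F.elu(self.fc1(x)))
            h = F.glu(self.gate(h), dim=-1)     # gating layer
            return self.norm(self.skip(x) + h)  # residual connection + norm

    # Usage: gate a batch of 16 feature vectors of width 8 into width 32.
    grn = GatedResidualNetwork(input_size=8, hidden_size=32)
    out = grn(torch.randn(16, 8))
    print(out.shape)  # torch.Size([16, 32])

Because the gate can drive a block's contribution to zero, the network can effectively skip over components that a given dataset does not need, which is the behavior the abstract attributes to the gating layers.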
Total citations
2020: 19 · 2021: 88 · 2022: 276 · 2023: 551 · 2024: 401