Authors
Mohammed Aldosari, John A Miller
Publication date
2023
Journal
European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (Bruges, Belgium)
Description
The success of the Transformer model has promoted recent advances in time series forecasting. This adoption has sparked interest in developing efficient Transformer models that scale well for forecasting long sequences, typically by relying on non-autoregressive, one-shot decoding. However, the role of autoregressive decoding is less explored. To address this gap, we revisit an essential idea of the vanilla Transformer model and show that autoregressive decoding performs competitively with non-autoregressive decoding. It also becomes vital for critical forecasting tasks, such as pandemic forecasting, where the stakes are high. Our code and data are publicly available at https://github.com/maldosari1/ar_transformer_tf.
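The abstract contrasts two decoding regimes. A minimal sketch of the distinction, using a toy mean-of-recent-values forecaster as a hypothetical stand-in for a trained Transformer (not the authors' implementation):

```python
def toy_model(context, window=3):
    """Hypothetical stand-in for a trained forecaster: one-step prediction
    as the mean of the last `window` observations."""
    recent = context[-window:]
    return sum(recent) / len(recent)

def autoregressive_decode(context, horizon):
    """Autoregressive: generate one step at a time, feeding each
    prediction back into the context for the next step."""
    ctx = list(context)
    out = []
    for _ in range(horizon):
        y = toy_model(ctx)
        out.append(y)
        ctx.append(y)  # the prediction becomes part of the conditioning context
    return out

def one_shot_decode(context, horizon):
    """Non-autoregressive: all horizon steps come from the fixed
    history in a single pass; later steps never see earlier outputs."""
    return [toy_model(context) for _ in range(horizon)]

history = [1.0, 2.0, 3.0]
print(autoregressive_decode(history, 3))  # later steps condition on prior outputs
print(one_shot_decode(history, 3))        # every step sees only the history
```

The trade-off the paper revisits: one-shot decoding is faster (a single forward pass for the whole horizon), while autoregressive decoding lets each forecast step condition on the steps already produced.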