Long Time Series Deep Forecasting with Multiscale Feature Extraction and Seq2seq Attention Mechanism

Authors: Xin Wang, Zhiming Cai, Yixian Luo, Zhijie Wen, Shihui Ying

Abstract

Long time series forecasting is an important problem with applications in many fields, such as weather forecasting, stock prediction, petroleum production prediction, and heating load forecasting. In recent years, the most popular methods for long time series forecasting have focused on extracting local information at a single scale using Convolutional Neural Networks (CNNs). Moreover, these methods use the basic attention mechanism to select relevant information from previous time steps to generate better outputs. However, long time series carry rich information at different scales, and the basic attention mechanism is not well suited to directly predicting a future sequence. In this paper, we propose a long time series forecasting method, named MS-LSTM, that combines multiscale feature extraction with a sequence-to-sequence (seq2seq) attention mechanism over the hidden states of a Long Short-Term Memory (LSTM) network. Concretely, MS-LSTM draws on the seq2seq attention mechanism to generate output sequences by making full use of previous information, while multiscale feature extraction uses CNNs with different convolution kernels to extract short-term features at different scales. Comparison and ablation experiments on the exchange rate dataset show that the proposed method achieves significant improvements over several state-of-the-art methods. We also apply the proposed model to a time series dataset of industrial equipment indexes provided by Shanghai Zhoubang Information Technology Company, and achieve state-of-the-art results in all cases.
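The multiscale extraction idea described in the abstract, applying 1-D convolutions with several kernel widths to the same series and pooling each output into a per-scale feature, can be sketched in plain NumPy. This is only an illustrative sketch: the kernel sizes, random filters, and mean-pooling here are assumptions for demonstration, not the paper's actual MS-LSTM configuration.

```python
import numpy as np

def conv1d_valid(x, kernel):
    """Valid-mode 1-D cross-correlation of a sequence with one kernel."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

def multiscale_features(x, kernel_sizes=(3, 5, 7), rng=None):
    """Extract short-term features at several temporal scales.

    Each kernel width captures patterns over a different time span; mean-pooling
    the valid convolution output yields one feature per scale, and the per-scale
    features are concatenated into a single vector.
    """
    rng = np.random.default_rng(rng)
    feats = []
    for k in kernel_sizes:
        kernel = rng.standard_normal(k)  # stand-in for a learned CNN filter
        feats.append(conv1d_valid(x, kernel).mean())
    return np.array(feats)

series = np.sin(np.linspace(0, 4 * np.pi, 64))  # toy long time series
features = multiscale_features(series, rng=0)
print(features.shape)  # one feature per scale: (3,)
```

In the actual model, each scale's features would be learned end-to-end and fed into the LSTM encoder rather than produced by fixed random filters.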

Keywords: Long time series forecasting, LSTM, Multiscale, Seq2seq attention mechanism


DOI: https://doi.org/10.1007/s11063-022-10774-0