EA-LSTM: Evolutionary attention-based LSTM for time series prediction

Authors:

Highlights:

Abstract

Time series prediction with deep learning methods, especially the Long Short-Term Memory neural network (LSTM), has achieved significant progress in recent years. Although LSTM can capture long-term dependencies, its ability to pay different degrees of attention to sub-window features within multiple time steps is insufficient. To address this issue, an evolutionary attention-based LSTM trained with competitive random search is proposed for multivariate time series prediction. By transferring shared parameters, an evolutionary attention learning approach is introduced to the LSTM so that, analogous to biological evolution, the pattern for importance-based attention sampling can be determined during temporal relationship mining. To avoid becoming trapped in local optima, as traditional gradient-based methods can, a competitive random search method inspired by evolutionary computation is proposed, which can effectively configure the parameters of the attention layer. Experimental results show that the proposed model achieves competitive prediction performance compared with other baseline methods.
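The abstract gives only a high-level description of the approach. As a rough illustration of the stated idea (attention weights over the time steps of an input window, chosen by a competitive random search rather than by backpropagation), here is a minimal sketch assuming a PyTorch-style implementation; the class and function names (EALSTM, competitive_random_search) and all hyperparameters are hypothetical, not the authors' code.

```python
# Minimal sketch of the abstract's idea, under assumed details: the attention
# vector weighting the T time steps of each window is selected by a simple
# competitive random search (evaluate a population of candidates, keep the
# best, perturb it for the next generation), not by gradient descent.
import torch
import torch.nn as nn

class EALSTM(nn.Module):
    def __init__(self, n_features, hidden_size, window_len):
        super().__init__()
        self.window_len = window_len
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x, attn_logits):
        # attn_logits: (window_len,) raw importance scores per time step,
        # supplied by the random search rather than learned by backprop.
        attn = torch.softmax(attn_logits, dim=0)        # importance weights
        x = x * attn.view(1, -1, 1)                     # re-weight time steps
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)     # one-step prediction

def competitive_random_search(model, x, y, window_len, pop=20, iters=30, sigma=0.3):
    """Keep the attention vector with the lowest loss and perturb it to seed
    the next population: a minimal stand-in for competitive random search."""
    loss_fn = nn.MSELoss()
    best = torch.zeros(window_len)
    best_loss = float("inf")
    for _ in range(iters):
        candidates = best + sigma * torch.randn(pop, window_len)
        for cand in candidates:
            with torch.no_grad():
                loss = loss_fn(model(x, cand), y).item()
            if loss < best_loss:
                best, best_loss = cand.clone(), loss
    return best, best_loss

# Toy usage on random data (windows of 8 time steps, 3 features).
torch.manual_seed(0)
x = torch.randn(64, 8, 3)
y = x[:, -1, 0] + 0.1 * torch.randn(64)
model = EALSTM(n_features=3, hidden_size=16, window_len=8)
attn, loss = competitive_random_search(model, x, y, window_len=8)
print("selected attention weights:", torch.softmax(attn, dim=0), "loss:", loss)
```

In the paper's full training scheme the LSTM weights would also be trained (by gradient descent) while the attention parameters are shared with and refined by the search; the sketch omits that alternation for brevity.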

Keywords: Evolutionary computation, Deep neural network, Time series prediction

Article history: Received 15 November 2018, Revised 28 February 2019, Accepted 18 May 2019, Available online 27 May 2019, Version of Record 16 August 2019.

DOI: https://doi.org/10.1016/j.knosys.2019.05.028