A multi-head attention-based transformer model for traffic flow forecasting with a comparative analysis to recurrent neural networks

Authors:

Highlights:

• Applicability of transformers in traffic state forecasting is justified.

• A comprehensive performance comparison with GRU and LSTM is presented.

• Transformers require large amounts of training data to achieve good performance.

• Transformers are better suited to capturing long-range features than GRU or LSTM (see the sketch after this list).

• The proposed model improves the mean absolute percentage error over related baselines.
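The highlights above are the only technical description on this page, so below is a minimal, hypothetical sketch of the kind of multi-head attention transformer the title describes, not the authors' released code. It assumes PyTorch and a simple sliding-window setup in which the last 12 flow readings predict the next one; the class name, window length, and all hyperparameters are illustrative choices, not values taken from the paper.

```python
# Minimal sketch (assumed setup, not the paper's implementation) of a
# multi-head attention transformer encoder for one-step traffic flow
# forecasting from a sliding window of past readings.
import torch
import torch.nn as nn


class TrafficFlowTransformer(nn.Module):  # hypothetical name
    def __init__(self, window: int = 12, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Project each scalar flow reading into the model dimension.
        self.input_proj = nn.Linear(1, d_model)
        # Learned positional embeddings over the input window.
        self.pos_emb = nn.Parameter(torch.zeros(1, window, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer,
                                             num_layers=n_layers)
        # Regress the next flow value from the last time step's embedding.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window) raw flow readings.
        h = self.input_proj(x.unsqueeze(-1)) + self.pos_emb
        h = self.encoder(h)            # self-attention over the window
        return self.head(h[:, -1, :])  # (batch, 1) one-step forecast


if __name__ == "__main__":
    model = TrafficFlowTransformer()
    dummy = torch.randn(8, 12)         # 8 windows of 12 readings each
    print(model(dummy).shape)          # torch.Size([8, 1])
```

An equivalent GRU or LSTM baseline would replace the encoder with nn.GRU or nn.LSTM over the same window; the contrast the highlights point to is that self-attention lets every time step attend to every other time step directly, rather than propagating information step by step through a recurrent state.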

Abstract:


Keywords: Intelligent transportation system, Time-series forecasting, Deep learning, Long short-term memory, Gated recurrent unit, PeMS

Article history: Received 11 February 2022, Revised 16 April 2022, Accepted 17 April 2022, Available online 21 April 2022, Version of Record 29 April 2022.

Article URL: https://doi.org/10.1016/j.eswa.2022.117275