Context-aware road travel time estimation by coupled tensor decomposition based on trajectory data

Abstract:

Citywide estimation and prediction of urban road travel times is a necessary and important task for recommending optimal travel paths. However, this problem has not yet been well addressed: most existing approaches face serious data sparsity issues, e.g., a lack of sensor data on many road segments, and it is difficult to capture contextual patterns around road segments and incorporate context-aware information into travel time estimation models. To address these issues, we propose to use trajectory data to model road travel times, since this type of data covers more urban road segments than data from traditional traffic monitoring systems. Moreover, trajectories inherently contain both travel times and the context of road congestion. We then put forward a general framework for context-aware road travel time estimation (CARTE). Specifically, we adopt a third-order tensor to model spatiotemporal road travel times by taking the congestion level as the third dimension. By incorporating an additional source of context information, namely points of interest (POIs), we propose a coupled tensor decomposition algorithm to fill in the missing entries. Finally, we propose an algorithm that computes the final two-dimensional (spatial and temporal) travel time matrix by weighting the estimate for each congestion level with its congestion probability. The effectiveness of CARTE was validated on two real-world datasets and compared with state-of-the-art methods. The experimental results demonstrate that the proposed travel time prediction approach consistently achieves the best accuracy across different levels of data sparsity and prediction horizons.
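The abstract describes two computational steps: (i) jointly factorizing a sparse road × time × congestion-level travel-time tensor with a road × POI-category matrix that shares the road-segment factor, and (ii) collapsing the congestion dimension by weighting each level's estimate with its congestion probability. The sketch below is not the authors' implementation; it is a minimal illustration of that idea using a CP-style coupled factorization trained by gradient descent, with all shapes, hyperparameters (`rank`, `lam`, `alpha`, `lr`, `n_iter`), and function names chosen hypothetically for clarity.

```python
# Hypothetical sketch of coupled tensor-matrix factorization for travel-time
# imputation, as described in the CARTE abstract (not the paper's algorithm).
import numpy as np

def coupled_cp(T, mask_T, P, mask_P, rank=8, lam=0.1, alpha=0.5,
               lr=0.01, n_iter=500, seed=0):
    """Impute missing entries of T via factorization coupled with P.

    T:      (n_roads, n_times, n_levels) travel-time tensor, NaNs where missing
    mask_T: 1 where T is observed, 0 where missing
    P:      (n_roads, n_pois) POI feature matrix
    mask_P: 1 where P is observed
    """
    rng = np.random.default_rng(seed)
    n_r, n_t, n_l = T.shape
    n_p = P.shape[1]
    U = rng.standard_normal((n_r, rank)) * 0.1   # road-segment factor (shared with P)
    V = rng.standard_normal((n_t, rank)) * 0.1   # time-slot factor
    W = rng.standard_normal((n_l, rank)) * 0.1   # congestion-level factor
    Q = rng.standard_normal((n_p, rank)) * 0.1   # POI-category factor
    T0 = np.nan_to_num(T)

    for _ in range(n_iter):
        # CP reconstruction: T_hat[i,j,k] = sum_f U[i,f] * V[j,f] * W[k,f]
        T_hat = np.einsum('if,jf,kf->ijk', U, V, W)
        P_hat = U @ Q.T
        E_T = mask_T * (T_hat - T0)          # residual on observed tensor entries only
        E_P = mask_P * (P_hat - P)           # residual on observed POI entries only
        # Gradients of 0.5*||E_T||^2 + 0.5*alpha*||E_P||^2 + 0.5*lam*||factors||^2
        gU = np.einsum('ijk,jf,kf->if', E_T, V, W) + alpha * E_P @ Q + lam * U
        gV = np.einsum('ijk,if,kf->jf', E_T, U, W) + lam * V
        gW = np.einsum('ijk,if,jf->kf', E_T, U, V) + lam * W
        gQ = alpha * E_P.T @ U + lam * Q
        U -= lr * gU; V -= lr * gV; W -= lr * gW; Q -= lr * gQ

    return np.einsum('if,jf,kf->ijk', U, V, W)  # dense, imputed tensor

def to_travel_time_matrix(T_full, level_prob):
    """Collapse the congestion dimension into a (n_roads, n_times) matrix.

    level_prob: (n_roads, n_times, n_levels) congestion probabilities
    (assumed to be estimated empirically, e.g., from trajectory counts),
    summing to 1 over the last axis.
    """
    return np.sum(T_full * level_prob, axis=2)
```

Coupling the POI matrix through the shared road-segment factor `U` is what lets context information propagate into segments with few or no trajectory observations; the masking ensures only observed entries drive the fit, which is how the sketch handles data sparsity.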

Keywords: Sparse data, Context aware, Trajectory, Tensor decomposition, Road travel time

Article history: Received 1 October 2020; Revised 12 March 2022; Accepted 12 March 2022; Available online 28 March 2022; Version of Record 2 April 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.108596