Multi-object tracking through learning relational appearance features and motion patterns

Authors:

Highlights:

Abstract

Multi-object tracking (MOT) aims to track multiple targets simultaneously, e.g., pedestrians in this work, by locating them and maintaining their identities to form individual trajectories. Despite recent advances in object detection, MOT based on the tracking-by-detection principle remains a challenging task in complex and crowded conditions. For example, occlusion, missed detections, and objects frequently entering and leaving the scene often cause tracking failures such as identity switches and trajectory fragmentation. To tackle these issues, a new data association approach, namely relational appearance features and motion patterns learning (RAFMPL)-based data association, is proposed to facilitate MOT. In RAFMPL-MOT, the proposed relational-features-based appearance model differs from conventional approaches in that it generates tracklets from relational information: one reference object is selected, and the feature differences between the reference object and the other objects are exploited. In addition, the motion-patterns-learning-based motion model allows both linear and nonlinear confident motion patterns to be considered in data association. The proposed approach effectively addresses the key difficulties of MOT. In particular, RAFMPL-MOT can more robustly reassign the same ID to an object that disappears from the scene (even for a moderately long period) and later reappears. It also improves robustness to the occlusion problems that frequently occur in real situations. Experimental results show that RAFMPL-MOT generally outperforms existing competitive MOT approaches.
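To make the two components of the abstract concrete, here is a minimal Python sketch under illustrative assumptions: appearance descriptors are fixed-length vectors, positions are 2-D image coordinates, tracks and detections share the same reference index, and the motion term is a plain constant-velocity extrapolation (the paper also learns nonlinear motion patterns, which this toy term omits). The function names, the cost weighting, and the Hungarian solver are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def relational_features(feats, ref_idx):
    """Express each appearance descriptor relative to one reference object.

    feats:   (N, D) array of per-object appearance descriptors.
    ref_idx: index of the chosen reference object; the difference vectors
             feats[i] - feats[ref_idx] play the role of relational features.
    """
    return feats - feats[ref_idx]


def motion_cost(track_positions, detection_pos):
    """Toy linear motion term: deviation of a detection from a
    constant-velocity prediction made from the last two track positions.
    (The paper additionally models nonlinear motion patterns.)"""
    p_prev, p_last = track_positions[-2], track_positions[-1]
    predicted = p_last + (p_last - p_prev)  # linear extrapolation
    return np.linalg.norm(predicted - detection_pos)


def associate(track_feats, det_feats, track_hist, det_pos,
              ref_idx=0, alpha=0.5):
    """Combine relational-appearance and motion costs into one matrix and
    solve the track-to-detection assignment with the Hungarian algorithm.

    Assumes the object at ref_idx is visible in both sets; alpha weights
    appearance against motion (both choices are illustrative)."""
    rt = relational_features(track_feats, ref_idx)   # (T, D)
    rd = relational_features(det_feats, ref_idx)     # (M, D)
    app = np.linalg.norm(rt[:, None, :] - rd[None, :, :], axis=2)  # (T, M)
    mot = np.array([[motion_cost(h, p) for p in det_pos]
                    for h in track_hist])                          # (T, M)
    cost = alpha * app + (1 - alpha) * mot
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))  # matched (track, detection) index pairs
```

Because the appearance term depends only on differences relative to the reference object, a global appearance shift (e.g., an illumination change affecting all objects) cancels out, which hints at why relational features can help re-identify an object that reappears after a long absence.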

Keywords:

Article history: Received 7 September 2016, Revised 27 April 2017, Accepted 22 May 2017, Available online 23 May 2017, Version of Record 27 September 2017.

Paper link: https://doi.org/10.1016/j.cviu.2017.05.010