Is that my hand? An egocentric dataset for hand disambiguation

Authors:

Highlights:

• Construction of an egocentric-perspective image dataset for hand disambiguation, with variability in people, places, and activities.

• Analysis of the dataset's characteristics to inform future work, and comparison with previous datasets.

• Context information (arms) alone can be effective for detecting hands in the image.

• Proposal of three joint neural network architectures that combine hand and context information (see the sketch after this list).

• Performance comparison with state-of-the-art methods to benchmark the dataset.
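To make the idea of combining hand and context (arm) information concrete, below is a minimal late-fusion sketch, not the authors' three architectures: two small CNN branches encode a hand crop and a surrounding context crop, and their features are concatenated before classification. The branch design, crop sizes, and the four-way own-left/own-right/other-left/other-right label set are assumptions for illustration only.

```python
# Hypothetical late-fusion sketch (not the paper's architectures): fuse
# features from a hand crop and its context (arm) crop before classifying.
import torch
import torch.nn as nn

class JointHandContextNet(nn.Module):
    def __init__(self, num_classes=4):  # assumed: own/other x left/right
        super().__init__()
        # Two identical small CNN branches: one per input region.
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.hand_branch = branch()
        self.context_branch = branch()
        # Fusion head: concatenate branch features, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, num_classes)
        )

    def forward(self, hand_crop, context_crop):
        h = self.hand_branch(hand_crop)
        c = self.context_branch(context_crop)
        return self.classifier(torch.cat([h, c], dim=1))

# Example usage with a batch of two 128x128 hand and context crops.
model = JointHandContextNet()
logits = model(torch.randn(2, 3, 128, 128), torch.randn(2, 3, 128, 128))
```

Early fusion (stacking the crops as input channels) or attention-based fusion are equally plausible ways to join the two streams; the sketch above only illustrates the general hand-plus-context idea from the highlights.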


Keywords: Egocentric perspective, Hand detection

Article history: Received 4 June 2019, Accepted 19 June 2019, Available online 5 July 2019, Version of Record 31 July 2019.

DOI: https://doi.org/10.1016/j.imavis.2019.06.002