An eye–hand data fusion framework for pervasive sensing of surgical activities

Abstract

This paper describes a generic framework for activity recognition based on temporal signals acquired from multiple input modalities and demonstrates its use for eye–hand data fusion. As part of the data fusion framework, we present a multi-objective Bayesian Framework for Feature Selection (BFFS) with a pruned-tree search algorithm for finding the optimal feature set(s) in a computationally efficient manner. Experiments on endoscopic surgical episode recognition are used to investigate the potential of eye-tracking for pervasive monitoring of surgical operations and to demonstrate how the additional information provided by hand motion can further enhance recognition accuracy. With the proposed multi-objective BFFS algorithm, feature sets that are suitable in terms of both feature relevancy and redundancy can be identified with a minimal number of instruments being tracked.
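The abstract does not spell out the BFFS algorithm itself. As a rough, hedged illustration of the general idea behind multi-objective feature selection that trades feature relevancy against redundancy, the sketch below scores candidate feature subsets with empirical mutual information and keeps the Pareto-optimal ones. The function names, the exhaustive subset enumeration, and the mutual-information objectives are assumptions for illustration only; they do not reproduce the paper's Bayesian formulation or its pruned-tree search.

```python
import numpy as np
from itertools import combinations


def mutual_information(x, y):
    """Empirical mutual information between two discrete-valued vectors."""
    n = len(x)
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    px = {v: np.mean(x == v) for v in set(x)}
    py = {v: np.mean(y == v) for v in set(y)}
    mi = 0.0
    for (xi, yi), c in joint.items():
        pxy = c / n
        mi += pxy * np.log(pxy / (px[xi] * py[yi]))
    return mi


def objectives(features, X, y):
    """Two objectives: relevancy (maximise) and redundancy (minimise)."""
    relevancy = np.mean([mutual_information(X[:, f], y) for f in features])
    if len(features) > 1:
        redundancy = np.mean([mutual_information(X[:, a], X[:, b])
                              for a, b in combinations(features, 2)])
    else:
        redundancy = 0.0
    return relevancy, redundancy


def dominates(a, b):
    """Pareto dominance: a is at least as relevant and no more redundant."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b


def pareto_feature_sets(X, y, max_size):
    """Enumerate subsets up to max_size and keep the Pareto-optimal ones.

    This exhaustive search stands in for the paper's pruned-tree search,
    which avoids evaluating dominated branches.
    """
    n_features = X.shape[1]
    candidates = []
    for k in range(1, max_size + 1):
        for subset in combinations(range(n_features), k):
            candidates.append((subset, objectives(subset, X, y)))
    return [c for c in candidates
            if not any(dominates(o[1], c[1]) for o in candidates)]
```

A caller would pass discretised eye and hand motion features as columns of `X` with surgical episode labels in `y`; the returned Pareto front contains the subsets among which a final choice (e.g. the smallest subset meeting an accuracy target) can be made.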

Keywords: Activity recognition, Eye–hand coordination, Feature selection, Multi-objective feature selection, Multi-objective BFFS, Surgical workflow classification

Article history: Received 12 April 2011, Revised 10 January 2012, Accepted 11 January 2012, Available online 28 January 2012.

DOI: https://doi.org/10.1016/j.patcog.2012.01.008