XAI for myo-controlled prosthesis: Explaining EMG data for hand gesture classification

Authors:

Highlights:

Abstract

Machine Learning has recently found a fertile ground in EMG signal decoding for prosthesis control. However, its understanding and acceptance are strongly limited by the notion of AI models as black boxes. In critical fields, such as medicine and neuroscience, understanding the neurophysiological phenomena underlying a model’s outcome is as relevant as its classification performance. In this work, we adapt state-of-the-art XAI algorithms to EMG hand gesture classification in order to relate the outcome of machine learning models to physiological processes, evaluating the contribution of each input feature to the prediction and showing that AI models recognize hand gestures by efficiently mapping and fusing the high-amplitude activity of synergic muscles. This allows us to (i) drastically reduce the number of required electrodes without a significant loss in classification performance, making the system suitable for a larger population of amputees and simplifying the realization of near real-time applications, and (ii) perform an efficient selection of features based on their classification relevance, as captured by the XAI algorithms. This feature selection leads to classification improvements in terms of robustness and computational time, outperforming correlation-based methods. Finally, (iii) comparing the physiological explanations produced by the XAI algorithms with the experimental setting highlights inconsistencies in electrode positioning across different rounds or users, thereby improving the overall quality of the process.
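A minimal sketch of the XAI-driven feature-selection idea summarized in point (ii): attribute each EMG feature's contribution to the classifier's predictions, keep only the most relevant ones, and retrain on the reduced set. This is not the authors' implementation; permutation importance stands in for the paper's XAI attribution algorithms, and the feature matrix, labels, classifier, and `top_k` value are illustrative placeholders.

```python
# Hedged sketch: XAI-style feature selection for EMG gesture classification.
# X (one row per EMG window, one column per feature/channel) and y (gesture
# labels) are assumed to be precomputed; random data is used here as a stand-in.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 64))   # placeholder: 600 windows x 64 EMG features
y = rng.integers(0, 6, size=600)     # placeholder: 6 hand gestures

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Baseline classifier trained on all features.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
baseline_acc = clf.score(X_te, y_te)

# 2) Attribute each feature's contribution to the predictions
#    (stand-in for the paper's XAI algorithms).
attr = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)

# 3) Keep only the most relevant features (e.g. a reduced electrode set) and retrain.
top_k = 16
keep = np.argsort(attr.importances_mean)[::-1][:top_k]
clf_small = RandomForestClassifier(n_estimators=200, random_state=0)
clf_small.fit(X_tr[:, keep], y_tr)
reduced_acc = clf_small.score(X_te[:, keep], y_te)

print(f"all {X.shape[1]} features: acc={baseline_acc:.3f}")
print(f"top {top_k} features:     acc={reduced_acc:.3f}")
```

On real EMG data, comparing `baseline_acc` and `reduced_acc` indicates whether the attribution-ranked subset preserves classification performance with far fewer electrodes, which is the trade-off the abstract describes.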

Keywords: EMG signal decoding, eXplainable AI, Myo-controlled prosthesis

Article history: Received 15 September 2021, Revised 25 November 2021, Accepted 24 December 2021, Available online 4 January 2022, Version of Record 29 January 2022.

Article URL: https://doi.org/10.1016/j.knosys.2021.108053