Learning with Hilbert–Schmidt independence criterion: A review and new perspectives
Abstract
The Hilbert–Schmidt independence criterion (HSIC) was originally designed to measure the statistical dependence of distribution-based Hilbert space embeddings in statistical inference. In recent years, this criterion has been shown to tackle a large number of learning problems owing to its effectiveness and high efficiency. In this article, we provide an in-depth survey of learning methods that use the HSIC for various learning problems, such as feature selection, dimensionality reduction, clustering, and kernel learning and optimization. Specifically, after introducing the basic idea of HSIC, we systematically review the typical learning models based on the HSIC, ranging from supervised learning to unsupervised learning, and from traditional machine learning to transfer learning and deep learning, followed by remaining challenges and future directions. The relationships between learning methods using the HSIC and other relevant learning algorithms are also discussed. By elucidating the similarities and differences of these learning models, we expect to provide practitioners with valuable guidelines for their specific domains.
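The "basic idea of HSIC" mentioned above can be illustrated with the standard biased empirical estimator, HSIC(X, Y) = tr(KHLH)/(n−1)², where K and L are kernel matrices on the two samples and H is the centering matrix. The following is a minimal sketch, not code from the surveyed paper; the Gaussian (RBF) kernel and the bandwidth value are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
    # sigma is an assumed bandwidth; in practice it is often set by the
    # median heuristic or tuned per task.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the kernel matrices.
    # A value near 0 suggests independence; larger values indicate dependence.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

A typical usage pattern is to compare the statistic for a dependent pair of samples against an independent pair: the dependent pair should yield a noticeably larger HSIC value, which is the property the surveyed learning methods exploit (e.g., maximizing HSIC between selected features and labels in feature selection).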
Keywords: Hilbert–Schmidt independence criterion (HSIC), Feature selection, Dimensionality reduction, Clustering, Kernel method, Machine learning
Article history: Received 19 May 2021, Revised 11 September 2021, Accepted 2 October 2021, Available online 5 October 2021, Version of Record 23 October 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107567