Multi-attention concept-cognitive learning model: A perspective from conceptual clustering

Authors:

Highlights:

Abstract

Concept-cognitive learning (CCL), as a cognitive process, is an emerging field that simulates how the human brain learns concepts in a formal context. At the same time, attention is a core property of all perceptual and cognitive operations. Nevertheless, no existing CCL model or conceptual clustering method considers the impact of attention. In light of these observations, this article presents a novel concept learning method, called the multi-attention concept-cognitive learning model (MA-CLM), which addresses this issue by exploiting graph attention and the graph structure of the concept space. The model is designed to make concept cognition more reasonable: it generates pseudo-concepts with higher expected utility while also making classification tasks more efficient. Specifically, a conceptual attention space is learned for each decision class via attribute attention. Furthermore, a new concept clustering and concept generation method based on graph attention is proposed on top of the conceptual attention space. Comparative studies with S2CL over nine UCI data sets validate the effectiveness and efficiency of graph-attention-based concept clustering in concept-cognitive learning. In addition, we compare MA-CLM against several classical classification algorithms to demonstrate its favorable properties in classification tasks. Finally, the model is validated by concept generation on the handwritten digit dataset MNIST.
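To make the graph-attention idea sketched in the abstract concrete, the following is a minimal, illustrative Python sketch, not the authors' MA-CLM: it treats each concept as a node described by its binary intent (attribute vector), connects overlapping concepts into a toy concept graph, computes single-head graph-attention-style weights over neighbors, and groups concepts by the similarity of their attention-aggregated embeddings. All names, dimensions, and thresholds here are assumptions for illustration only.

```python
# Hypothetical sketch of graph-attention-style concept clustering.
# Not the paper's MA-CLM; a toy stand-in under assumed settings.
import numpy as np

rng = np.random.default_rng(0)

# Toy concept space: 6 concepts over 5 attributes (1 = attribute in the intent).
intents = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 0, 0, 1],
], dtype=float)

# Crude concept graph: connect concepts whose intents share attributes.
adj = (intents @ intents.T > 0).astype(float)
np.fill_diagonal(adj, 1.0)

# Single-head graph-attention-style layer: linear projection, pairwise scores
# via a shared attention vector, LeakyReLU, masked softmax over neighbors.
d_in, d_out = intents.shape[1], 4
W = rng.normal(scale=0.5, size=(d_in, d_out))
a = rng.normal(scale=0.5, size=(2 * d_out,))

h = intents @ W                                   # projected concept features
scores = np.array([[a @ np.concatenate([h[i], h[j]]) for j in range(len(h))]
                   for i in range(len(h))])
scores = np.maximum(0.2 * scores, scores)         # LeakyReLU (slope 0.2)
scores = np.where(adj > 0, scores, -np.inf)       # restrict to graph neighbors
alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)         # attention weights per concept

# Attention-weighted aggregation yields one embedding per concept; a simple
# cosine-similarity threshold stands in for a clustering step.
emb = alpha @ h
norms = np.linalg.norm(emb, axis=1, keepdims=True)
sim = (emb @ emb.T) / (norms * norms.T + 1e-12)
clusters = (sim > 0.9).astype(int)                # crude threshold grouping

print("attention weights:\n", np.round(alpha, 2))
print("cluster indicator matrix:\n", clusters)
```

In a full model one would learn W and a from data and replace the threshold grouping with a proper clustering and concept-generation procedure; the sketch only shows how attention weights over a concept graph can drive the grouping of concepts.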

Keywords: Concept-cognitive learning, Concept lattices, Conceptual clustering, Graph attention, Granular computing

Article history: Received 22 February 2022, Revised 12 July 2022, Accepted 13 July 2022, Available online 19 July 2022, Version of Record 1 August 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109472