Word and graph attention networks for semi-supervised classification

Authors: Jing Zhang, Mengxi Li, Kaisheng Gao, Shunmei Meng, Cangqi Zhou

Abstract

Graph attention networks are effective graph neural networks that perform graph embedding for semi-supervised learning by considering the neighbors of a node when learning its features. This paper presents a novel attention-based graph neural network that introduces an attention mechanism over the word-represented features of a node while also incorporating the neighbors' attention into the embedding process. Instead of using a vector as the feature of a node, as in traditional graph attention networks, the proposed method represents a node with a 2D matrix, where each row corresponds to a different attention distribution over the node's original word-represented features. The compressed features are then fed into a graph attention layer that aggregates the matrix representations of the node and its neighbors, weighted by different attention coefficients, into a new representation. By stacking several graph attention layers, the model obtains the final matrix representations of nodes, which account both for the differing importance of a node's neighbors and for the differing importance of the words in its original features. Experimental results on three citation network datasets show that the proposed method significantly outperforms eight state-of-the-art methods on semi-supervised classification tasks.
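A minimal PyTorch sketch of the two-level attention idea described in the abstract, not the authors' released code: a word-level attention module compresses a node's word-represented features into an r x d matrix (each row a different attention distribution over the words), and a graph attention layer then aggregates these matrix representations over neighbors. The module names, the dimensions r and d, and the dense-adjacency handling are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WordAttention(nn.Module):
    """Compress bag-of-words node features (N, V) into matrix representations (N, r, d)."""

    def __init__(self, vocab_size, d, r):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d)        # one embedding per vocabulary word
        self.query = nn.Parameter(torch.empty(r, d))       # r attention "views" over the words
        nn.init.xavier_uniform_(self.query)

    def forward(self, x):
        # x: (N, V) binary or tf-idf word features per node
        E = self.word_emb.weight                           # (V, d)
        scores = self.query @ E.t()                        # (r, V) word scores per view
        # add log-feature term so words absent from a node get (near) zero attention
        scores = scores.unsqueeze(0) + torch.log(x + 1e-10).unsqueeze(1)   # (N, r, V)
        A = torch.softmax(scores, dim=-1)                  # (N, r, V) attention over words
        return A @ E                                       # (N, r, d) matrix per node


class MatrixGATLayer(nn.Module):
    """Graph attention over matrix-shaped node representations: (N, r, d_in) -> (N, r, d_out)."""

    def __init__(self, d_in, d_out, r):
        super().__init__()
        self.W = nn.Linear(d_in, d_out, bias=False)        # shared row-wise transform
        self.a = nn.Parameter(torch.empty(2 * r * d_out, 1))  # attention vector on node pairs
        nn.init.xavier_uniform_(self.a)

    def forward(self, H, adj):
        # H: (N, r, d_in) matrix representations, adj: (N, N) adjacency with self-loops
        Z = self.W(H)                                      # (N, r, d_out)
        zf = Z.flatten(1)                                  # (N, r * d_out)
        n = zf.size(0)
        pair = torch.cat([zf.unsqueeze(1).expand(n, n, -1),
                          zf.unsqueeze(0).expand(n, n, -1)], dim=-1)       # (N, N, 2*r*d_out)
        e = F.leaky_relu((pair @ self.a).squeeze(-1))      # (N, N) attention logits
        e = e.masked_fill(adj == 0, float('-inf'))         # restrict attention to neighbors
        alpha = torch.softmax(e, dim=-1)                   # (N, N) neighbor weights
        return F.elu(torch.einsum('ij,jrd->ird', alpha, Z))  # weighted matrix aggregation
```

A toy usage example with illustrative shapes (5 nodes, 20-word vocabulary, r = 4 attention rows); stacking further `MatrixGATLayer`s and flattening the final matrices would give class logits for semi-supervised training:

```python
torch.manual_seed(0)
x = (torch.rand(5, 20) > 0.7).float()                  # bag-of-words node features
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t() + torch.eye(5)) > 0).float()     # symmetric adjacency with self-loops
word_attn = WordAttention(vocab_size=20, d=16, r=4)
gat = MatrixGATLayer(d_in=16, d_out=8, r=4)
H = word_attn(x)                                       # (5, 4, 16) matrix representation per node
out = gat(H, adj)                                      # (5, 4, 8) after neighbor attention
```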

Keywords: Graph neural networks, Graph embedding, Graph attention, Word attention, Semi-supervised classification

Review process:

Paper URL: https://doi.org/10.1007/s10115-021-01610-3