Knowledge graph attention mechanism for distant supervision neural relation extraction

Authors:

Highlights:

Abstract

Distant supervision neural relation extraction classifies the relation labels of instance bags that share the same head–tail entity pair. Since an entity pair usually carries multiple relation labels, intensive noise occurs in the instance bag. Recent works have mostly focused on developing attention mechanisms that down-weight sentences whose ground-truth relation labels differ from that of the instance bag. However, their general weakness is the failure to explore the semantic correlations between an entity pair and its context. Additionally, the number of instances per relation category follows a long-tail distribution, and extracting long-tail relations remains a challenge. Therefore, the Knowledge Graph ATTention (KGATT) mechanism is proposed to address both the noise and the long-tail problems; it contains two modules: a fine-alignment mechanism and an inductive mechanism. In particular, the fine-alignment mechanism learns the ground-truth relation of each sentence by aligning it with all predefined relations. The inductive mechanism learns enhanced relation representations from neighbors in the knowledge graph (KG) to compensate for data scarcity. Through the mutual reinforcement of the two modules, our model enriches the representation of the instance bag, which not only improves generalization ability but also alleviates the long-tail phenomenon. Extensive experiments and ablation studies are conducted on the NYT-FB60K and GIDS-FB8K datasets, and the results show that KGATT is effective in improving performance. Built on the Piecewise Convolutional Neural Network (PCNN), our model achieves superior performance on various metrics as well as on long-tail relations.
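The bag-level selective attention that the abstract builds on can be illustrated with a minimal NumPy sketch: each sentence in a bag is scored against a relation query vector, and a softmax over the scores down-weights noisy sentences before the bag representation is aggregated. This is a generic illustration of selective attention, not the paper's KGATT; the function and variable names are ours, and the fine-alignment and inductive modules described above are considerably more elaborate.

```python
import numpy as np

def bag_attention(sentence_reprs: np.ndarray, relation_query: np.ndarray) -> np.ndarray:
    """Weight sentences in a bag by their similarity to a relation query.

    sentence_reprs: (n_sentences, dim) sentence encodings (e.g., from a PCNN)
    relation_query: (dim,) embedding of a candidate relation
    Returns the (dim,) attention-weighted bag representation.
    """
    scores = sentence_reprs @ relation_query      # (n,) dot-product similarity
    weights = np.exp(scores - scores.max())       # numerically stable softmax
    weights /= weights.sum()
    return weights @ sentence_reprs               # weighted sum over the bag

# Toy bag: the second sentence aligns with the query, so it should dominate
# the bag representation while the mismatched sentences are down-weighted.
bag = np.array([[1.0, 0.0],
                [0.0, 5.0],
                [0.5, 0.0]])
query = np.array([0.0, 1.0])
rep = bag_attention(bag, query)
```

In a full model, one such bag representation is computed per candidate relation and fed to a classifier; KGATT further refines the attention with per-sentence alignment and KG-neighbor information.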

Keywords: Knowledge graph, Long-tail phenomenon, Fine-alignment mechanism, Inductive mechanism

Article history: Received 9 February 2022, Revised 24 August 2022, Accepted 26 August 2022, Available online 31 August 2022, Version of Record 13 September 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109800