EIGAT: Incorporating global information in local attention for knowledge representation learning

Abstract:

Graph Attention Networks (GATs) have proven to be a promising class of models that exploit a localized attention mechanism to perform knowledge representation learning (KRL) on graph-structured data such as Knowledge Graphs (KGs). While these approaches model the local pairwise importance between entities, they cannot model an entity's global importance relative to all other entities in the KG. As a result, such models miss critical information in tasks where global information is a significant component, such as knowledge representation learning. To address this issue, we incorporate global information into the GAT family of models through scaled entity importance, which is computed by an attention-based global random-walk algorithm. In the context of KRL, incorporating global information boosts performance significantly. Experimental results on KG entity prediction against state-of-the-art methods demonstrate the effectiveness of the proposed model.
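To make the idea concrete, below is a minimal sketch (not the authors' released implementation) of the two ingredients the abstract names: a PageRank-style global random walk whose transition weights come from edge attention scores, yielding one importance score per entity, and a rescaling of local GAT attention coefficients by those global scores. All names here (global_importance, scaled_attention, restart) are hypothetical; the restart probability plays the usual damping role of random-walk-with-restart methods.

```python
# A minimal sketch of the idea described above, NOT the authors' code:
# (1) a PageRank-style random walk driven by attention scores gives one
#     global importance score per entity, and
# (2) local GAT attention is rescaled by that global importance.
# All names (global_importance, scaled_attention, restart) are hypothetical.
import numpy as np

def global_importance(adj_attn: np.ndarray, restart: float = 0.15,
                      iters: int = 50) -> np.ndarray:
    """Attention-weighted global random walk with restart.

    adj_attn[i, j] holds the non-negative attention score on edge i -> j;
    returns one importance score per entity."""
    n = adj_attn.shape[0]
    row_sums = adj_attn.sum(axis=1, keepdims=True)
    # Row-normalize attention into transition probabilities; rows with no
    # outgoing edges fall back to the preset uniform 1/n entries.
    trans = np.divide(adj_attn, row_sums,
                      out=np.full_like(adj_attn, 1.0 / n),
                      where=row_sums > 0)
    imp = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(iters):
        imp = restart / n + (1.0 - restart) * trans.T @ imp
    return imp

def scaled_attention(local_attn: np.ndarray, imp: np.ndarray) -> np.ndarray:
    """Scale each neighbor j's local attention by its global importance
    imp[j], then re-normalize every row to sum to one."""
    scaled = local_attn * imp[None, :]
    return scaled / (scaled.sum(axis=1, keepdims=True) + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A random sparse attention matrix standing in for learned GAT scores.
    attn = rng.random((5, 5)) * (rng.random((5, 5)) > 0.5)
    imp = global_importance(attn)
    print("global importance:", np.round(imp, 3))
    print("scaled attention:\n", np.round(scaled_attention(attn, imp), 3))
```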

Keywords: Graph Neural Networks, Knowledge Graphs, Knowledge representation learning

Article history: Received 2 June 2021, Revised 19 October 2021, Accepted 5 December 2021, Available online 10 December 2021, Version of Record 21 December 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107909