Knowledge graph embedding by reflection transformation

Authors:

Highlights:

Abstract

Due to the incompleteness of knowledge graphs, knowledge graph embedding (KGE) has become a key technique for automatically predicting missing facts in a knowledge graph. KGE aims to learn low-dimensional representations for both relations and entities. Although existing KGE models have achieved state-of-the-art (SOTA) performance, modeling and inferring relation patterns (such as symmetry/antisymmetry, inversion, and composition), as well as predicting complex relations (such as M-to-1, 1-to-M, and M-to-M), remain challenging. In this paper, we propose a new KGE model called ReflectE, which regards each relation as the normal vector of a relation-specific reflection hyperplane. Specifically, ReflectE regards the tail entity (or head entity) in a triple as the reflection of the head entity (or tail entity) across a relation-specific hyperplane. It can therefore model symmetric and inverse relations naturally through reflection transformations. Furthermore, ReflectE models complex relations by learning a relation-specific dynamic reflection hyperplane. To evaluate the effectiveness of ReflectE, we choose previous SOTA models as baselines and conduct the link prediction task on three popular datasets. Experimental results show that, compared with conventional distance-based KGE models, ReflectE achieves SOTA results for link prediction.
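The core idea above, scoring a triple by reflecting the head entity across a hyperplane whose normal vector is the relation embedding, can be sketched with a Householder reflection. This is an illustrative reconstruction from the abstract only, not the paper's exact score function (the function names and the negative-distance score are assumptions); it does show why reflections handle symmetric relations: a reflection is its own inverse, so a triple and its reverse receive the same score.

```python
import numpy as np

def householder_reflect(x, w):
    """Reflect vector x across the hyperplane whose normal vector is w.

    Implements (I - 2 w w^T / ||w||^2) x, the Householder reflection.
    """
    w = w / np.linalg.norm(w)          # unit normal of the hyperplane
    return x - 2.0 * np.dot(w, x) * w

def score(h, r, t):
    """Distance-based plausibility score (smaller = more plausible).

    Hypothetical sketch: reflect the head embedding h across the
    relation hyperplane with normal r, then measure distance to t.
    """
    return np.linalg.norm(householder_reflect(h, r) - t)

# Reflections are involutions: reflecting twice recovers the input,
# so score(h, r, t) == score(t, r, h) for any relation normal r,
# which is exactly the behavior needed for symmetric relations.
h = np.array([1.0, 2.0, 3.0])
t = np.array([0.5, -1.0, 2.0])
r = np.array([1.0, 0.0, 1.0])
```

The involution property also explains inverse relations: if one relation's reflection maps h to t, the same reflection maps t back to h, so the inverse relation can share (or negate) the same hyperplane.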

Keywords: Knowledge graph embedding, Reflection transformation, Link prediction

Article history: Received 11 August 2021, Revised 15 October 2021, Accepted 2 December 2021, Available online 8 December 2021, Version of Record 22 December 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107861