JointE: Jointly utilizing 1D and 2D convolution for knowledge graph embedding

Authors:

Highlights:

Abstract

Knowledge graph embedding is a popular method for predicting missing links in knowledge graphs by projecting entities and relations into continuous low-dimensional embeddings. Some recent embedding models employ translation-based operations to learn the representations of entities and relations with shallow, linear structures, while others leverage neural networks, especially convolutional neural networks, to embed entities and relations with deep, non-linear structures. However, shallow and linear models limit the capacity to extract latent knowledge, while deep and non-linear models lead to an overabundance of parameters and the loss of surface and explicit knowledge. In this paper, we propose JointE, which utilizes 1D and 2D convolution operations jointly to alleviate these issues effectively. More specifically, we utilize 2D convolution operations to facilitate the interactions between entities and relations, thereby capturing latent knowledge sufficiently. To reduce the number of parameters significantly, we innovatively construct 2D convolution filters from internal embeddings rather than using external filters, which cost a large number of redundant parameters. Furthermore, we employ 1D convolution filters over the input embeddings to extract surface and explicit knowledge and preserve it by element-wise addition. Experimental evaluation on five benchmark datasets demonstrates that our model outperforms all other state-of-the-art convolution-based models while improving parameter efficiency.
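To make the joint 1D/2D convolution idea concrete, the following is a minimal sketch of the scoring flow the abstract describes: a 2D branch whose filters are carved out of the relation embedding itself ("internal" filters), a 1D branch over the raw input embeddings to retain surface knowledge, and element-wise addition to combine them. All shapes, filter sizes, reshaping choices, and the final scoring step are assumptions for illustration only; they are not taken from the paper.

```python
# Hypothetical sketch of the JointE idea (not the authors' implementation).
import torch
import torch.nn.functional as F

d = 200                      # embedding dimension (assumed)
k = 2                        # 2D filter size k x k (assumed)
n_filters = d // (k * k)     # number of 2D filters built from the relation embedding

def jointe_score_sketch(e_h, r, e_t):
    """e_h, r, e_t: 1D tensors of size d (head, relation, tail embeddings)."""
    # --- 2D branch: reshape head and relation into a 2D grid and convolve it
    # with filters constructed from the relation embedding (no external filters).
    x2d = torch.stack([e_h.view(10, 20), r.view(10, 20)], dim=0)       # (2, 10, 20)
    filters_2d = r[: n_filters * k * k].view(n_filters, 1, k, k)        # internal filters
    filters_2d = filters_2d.repeat(1, 2, 1, 1)                          # apply to both channels
    feat_2d = F.conv2d(x2d.unsqueeze(0), filters_2d, padding=k // 2)    # (1, n_filters, H, W)

    # --- 1D branch: convolve the raw input embeddings to keep surface knowledge.
    x1d = torch.stack([e_h, r], dim=0).unsqueeze(0)                     # (1, 2, d)
    filters_1d = torch.randn(n_filters, 2, 3)                           # assumed 1D filters
    feat_1d = F.conv1d(x1d, filters_1d, padding=1)                      # (1, n_filters, d)

    # --- Project both branches to the embedding size, combine by element-wise
    # addition, and score against the tail entity (projections are placeholders).
    v2d, v1d = feat_2d.flatten(1), feat_1d.flatten(1)
    W2 = torch.randn(v2d.size(1), d)
    W1 = torch.randn(v1d.size(1), d)
    combined = v2d @ W2 + v1d @ W1           # element-wise addition of the two branches
    return (combined * e_t).sum(-1)          # dot-product score for the triple (h, r, t)
```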

Keywords: Knowledge graph embedding, Knowledge graph, Convolution networks

Article history: Received 3 August 2021, Revised 18 December 2021, Accepted 30 December 2021, Available online 5 January 2022, Version of Record 24 January 2022.

Paper URL: https://doi.org/10.1016/j.knosys.2021.108100