Tucker decomposition-based temporal knowledge graph completion


Abstract:

Knowledge graphs have been demonstrated to be an effective tool for numerous intelligent applications. However, a large amount of valuable knowledge still exists only implicitly in knowledge graphs. To enrich existing knowledge graphs, many algorithms for link prediction and knowledge graph embedding have been designed in recent years to infer new facts. However, most of these studies focus on static knowledge graphs and ignore the temporal information that reflects the validity of knowledge. Developing models for temporal knowledge graph completion is therefore an increasingly important task. In this paper, we build a new tensor decomposition model for temporal knowledge graph completion inspired by the Tucker decomposition of an order-4 tensor. Furthermore, to improve the performance of this basic model, we provide three methods — cosine similarity, contrastive learning, and a reconstruction-based method — for incorporating prior knowledge into the proposed model. Because the core tensor of the proposed model contains a large number of parameters, we present two embedding regularization schemes to avoid over-fitting. By combining these two kinds of regularization with the proposed model, our model outperforms baselines by a clear margin on three temporal datasets (ICEWS2014, ICEWS05-15, and GDELT).
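To make the core idea concrete, the following is a minimal sketch (not the paper's implementation) of how an order-4 Tucker decomposition can score a temporal fact: a quadruple (subject, relation, object, timestamp) is scored by contracting a shared core tensor with the four corresponding embedding vectors. All dimensions and the random embeddings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embedding sizes (not taken from the paper):
# d_e for entities, d_r for relations, d_t for timestamps.
d_e, d_r, d_t = 4, 3, 2

# Order-4 core tensor W, with one mode per element of the quadruple (s, r, o, t).
W = rng.normal(size=(d_e, d_r, d_e, d_t))

def score(s, r, o, t):
    """Score a temporal fact by fully contracting the core tensor W
    with the subject, relation, object, and timestamp embeddings."""
    return np.einsum('iajb,i,a,j,b->', W, s, r, o, t)

# Illustrative embeddings for one quadruple; in practice these are learned.
s = rng.normal(size=d_e)
r = rng.normal(size=d_r)
o = rng.normal(size=d_e)
t = rng.normal(size=d_t)
print(float(score(s, r, o, t)))
```

A higher score indicates a more plausible fact at that timestamp; training would fit W and the embeddings so that observed quadruples score above corrupted ones.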

Keywords: Temporal knowledge graphs, Tucker decomposition, Reconstruction, Contrastive learning

Article history: Received 17 July 2021; Revised 22 November 2021; Accepted 27 November 2021; Available online 17 December 2021; Version of Record 22 December 2021.

Paper URL: https://doi.org/10.1016/j.knosys.2021.107841