Carefully chosen temporal smoothness regularisers can significantly improve the accuracy of neural link prediction in temporal knowledge graphs, with simpler tensor factorization models sometimes outperforming more complex approaches.
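As a concrete illustration (a generic nearest-neighbour penalty, not the exact regulariser from any one model), temporal smoothness can be imposed as $L_{smooth} = \sum_t \lVert T_{t+1} - T_t \rVert_p^p$ over consecutive timestamp embeddings, added with a weight to the factorisation loss. A minimal NumPy sketch, with hypothetical names:

```python
import numpy as np

def temporal_smoothness_penalty(timestamp_emb: np.ndarray, p: int = 2) -> float:
    """Sum of elementwise p-th-power differences between embeddings of
    consecutive timestamps. Large jumps between neighbouring time steps
    are penalised, so the embeddings are encouraged to drift smoothly."""
    diffs = timestamp_emb[1:] - timestamp_emb[:-1]
    return float(np.sum(np.abs(diffs) ** p))

# Embeddings that drift smoothly across ten timestamps...
smooth = np.linspace(0.0, 1.0, 10)[:, None] * np.ones((1, 4))
# ...versus the same rows visited in a jumpy order: a much larger penalty.
rough = smooth[[0, 9, 1, 8, 2, 7, 3, 6, 4, 5]]
```

The penalty is what makes neighbouring timestamps share information, which is one reason even simple factorisation models become competitive once it is tuned.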
The proposed IME model effectively captures the complex geometric structures of temporal knowledge graphs by simultaneously modeling them in multi-curvature spaces, including hyperspherical, hyperbolic, and Euclidean spaces. IME learns both space-shared and space-specific properties to mitigate spatial gaps and comprehensively capture characteristic features across different curvature spaces.
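To make "multi-curvature" concrete: the same pair of embedding vectors has a different geodesic distance in Euclidean space, on the unit hypersphere, and in the Poincaré ball model of hyperbolic space. The sketch below uses the standard closed-form distances for these three spaces — it is an illustration of the geometry, not IME's actual scoring function:

```python
import numpy as np

def euclidean_dist(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.linalg.norm(u - v))

def spherical_dist(u: np.ndarray, v: np.ndarray) -> float:
    # Geodesic on the unit hypersphere: the angle between the unit vectors.
    u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
    return float(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

def hyperbolic_dist(u: np.ndarray, v: np.ndarray) -> float:
    # Geodesic in the Poincare ball model (inputs must have norm < 1).
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq / denom))
```

Near the ball's boundary the hyperbolic distance grows without bound even when the Euclidean distance stays small — the property that makes hyperbolic space well suited to hierarchical structure, while spherical space suits cyclic structure; modelling in all three at once is what IME's space-shared and space-specific components are reconciling.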
Quaternion embeddings in hypercomplex space enhance temporal knowledge graph completion by capturing time-sensitive relations and periodic patterns.
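The algebraic core of quaternion embeddings is the Hamilton product, which is non-commutative — this is what lets a relation act as a rotation in 4D hypercomplex space. Below is the standard Hamilton product (the general operation, not any specific model's scoring function):

```python
import numpy as np

def hamilton_product(q1: np.ndarray, q2: np.ndarray) -> np.ndarray:
    """Hamilton product of two quaternions (a, b, c, d) = a + bi + cj + dk.
    Non-commutative: q1 * q2 != q2 * q1 in general, which is what allows
    quaternion embeddings to represent rotations in hypercomplex space."""
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return np.array([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,
        a1*b2 + b1*a2 + c1*d2 - d1*c2,
        a1*c2 - b1*d2 + c1*a2 + d1*b2,
        a1*d2 + b1*c2 - c1*b2 + d1*a2,
    ])

# The defining identities: i*j = k but j*i = -k, and i*i = -1.
i = np.array([0.0, 1.0, 0.0, 0.0])
j = np.array([0.0, 0.0, 1.0, 0.0])
```

In such models the entity embedding is typically rotated by a (time-dependent) relation quaternion and compared with the tail entity; the periodicity of rotation angles is what captures periodic temporal patterns.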
The authors propose PPT, a novel TKGC model that leverages pre-trained language models with prompts to incorporate temporal information from knowledge graphs. By verbalising timestamps and relations into prompts, PPT recasts temporal knowledge graph completion as a masked token prediction problem, exploiting both explicit temporal information and the semantics implied by relations.
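The masked-token reformulation can be illustrated with a toy prompt builder — the template and names below are hypothetical and differ from PPT's actual prompt designs. An incomplete quadruple (subject, relation, ?, timestamp) becomes a cloze sentence whose mask the language model must fill with the missing entity:

```python
def quadruple_to_prompt(subject: str, relation: str, timestamp: str,
                        mask_token: str = "[MASK]") -> str:
    """Turn an incomplete quadruple (subject, relation, ?, timestamp) into a
    cloze-style prompt for a masked language model. The relation name is
    verbalised into plain text (e.g. 'member_of' -> 'member of'), and the
    missing object entity is replaced by the mask token."""
    relation_text = relation.replace("_", " ")
    return f"On {timestamp}, {subject} {relation_text} {mask_token}."

prompt = quadruple_to_prompt("Barack Obama", "visited", "2014-03-15")
```

The pre-trained model then scores candidate entities for the masked position, so completion reduces to the model's native pre-training objective rather than a separately learned scoring function.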