The authors propose a novel temporal knowledge graph completion (TKGC) model, PPT, that leverages pre-trained language models with prompts to effectively incorporate temporal information and the relations implied in the graph. The approach converts the completion task into a masked token prediction problem.
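A minimal sketch of the masked-token reformulation, assuming a simple cloze-style template; the verbalizations, template wording, and function name here are illustrative, not PPT's actual prompt design:

```python
# Hypothetical sketch: turning a TKG quadruple (subject, relation, ?, time)
# into a cloze prompt that a pre-trained masked language model could fill in.
# The template and mask token are assumptions for illustration.

def quadruple_to_prompt(subject, relation, timestamp, mask_token="[MASK]"):
    """Verbalize an incomplete quadruple as a masked sentence.

    The missing object entity becomes the mask token, so link prediction
    reduces to masked token prediction over candidate entities.
    """
    return f"On {timestamp}, {subject} {relation} {mask_token}."

prompt = quadruple_to_prompt("Obama", "made a visit to", "2014-03-15")
print(prompt)  # On 2014-03-15, Obama made a visit to [MASK].
```

In a full pipeline, the prompt would be tokenized and scored by a pre-trained language model, with the mask position's logits restricted to the entity vocabulary.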
Quaternion embeddings in hypercomplex space enhance temporal knowledge graph completion by capturing time-sensitive relations and periodic patterns.
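The core operation behind quaternion embeddings is the Hamilton product, which rotates one quaternion by another; the sketch below shows that product and a unit normalization, but treating this as any specific model's scoring function is an assumption:

```python
# Sketch of quaternion (hypercomplex) operations used by quaternion-based
# KG embedding models: relations act as unit quaternions that rotate the
# subject embedding via the Hamilton product. Names are illustrative.
import math

def hamilton(p, q):
    """Hamilton product of quaternions p = (a, b, c, d) and q = (e, f, g, h)."""
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

def normalize(q):
    """Scale a quaternion to unit norm so it represents a pure rotation."""
    n = math.sqrt(sum(x * x for x in q))
    return tuple(x / n for x in q)

subject = (1.0, 0.0, 0.0, 0.0)            # identity quaternion
relation = normalize((1.0, 1.0, 0.0, 0.0))  # unit relation quaternion
rotated = hamilton(subject, relation)       # rotated subject embedding
```

Because rotations in quaternion space are periodic, time-dependent relation quaternions can naturally encode recurring temporal patterns.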
The proposed IME model effectively captures the complex geometric structures of temporal knowledge graphs by simultaneously modeling them in multi-curvature spaces, including hyperspherical, hyperbolic, and Euclidean spaces. IME learns both space-shared and space-specific properties to mitigate spatial gaps and comprehensively capture characteristic features across different curvature spaces.
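The three curvature spaces can be illustrated by projecting one Euclidean embedding into each; the exponential map at the origin of the Poincaré ball and unit-sphere projection below are standard constructions, and presenting them as IME's exact formulation would be an assumption:

```python
# Hedged sketch: mapping a Euclidean embedding into hyperbolic and
# hyperspherical space, the two non-Euclidean curvature spaces a
# multi-curvature model operates in. Function names are illustrative.
import math

def to_hyperbolic(v, c=1.0):
    """Exponential map at the origin of the Poincare ball (curvature -c)."""
    norm = math.sqrt(sum(x * x for x in v)) or 1e-12
    scale = math.tanh(math.sqrt(c) * norm) / (math.sqrt(c) * norm)
    return [scale * x for x in v]

def to_spherical(v):
    """Project onto the unit hypersphere."""
    norm = math.sqrt(sum(x * x for x in v)) or 1e-12
    return [x / norm for x in v]

euclidean = [0.3, -0.4, 0.5]
hyperbolic = to_hyperbolic(euclidean)   # lies strictly inside the unit ball
spherical = to_spherical(euclidean)     # lies exactly on the unit sphere
```

Hyperbolic space suits hierarchical structure, hyperspherical space suits cyclic structure, and Euclidean space suits chain-like structure, which is why modeling all three simultaneously can capture a graph's mixed geometry.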