Key Concepts
A Transformer-based model with connection-biased attention and entity role embeddings can effectively perform knowledge graph completion without the need for explicit path encoding.
Abstract
The paper proposes a model called CBLiP (Connection-Biased Link Prediction) for knowledge graph completion in the inductive setting, where the model needs to reason about entities that were not present during training.
Key highlights:
- CBLiP uses a Transformer-based subgraph encoding module with a novel connection-biased attention mechanism, eliminating the need for a costly path encoding module (a minimal sketch of the attention idea follows this list).
- The model introduces entity roles, a simple and effective construct for representing unseen entities in a subgraph, as an alternative to conventional relative-distance-based entity labeling (a sketch of this idea appears after the concluding paragraph below).
- Evaluations on standard inductive knowledge graph completion benchmarks show that CBLiP achieves the best or competitive performance compared to models that rely on path information.
- The effectiveness of connection-biased attention and entity role embeddings is also demonstrated in the transductive relation prediction task.
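To make the connection-biased attention idea concrete, here is a minimal sketch of one plausible reading: a Graphormer-style additive bias on the attention logits, with one learned bias per connection type. The class name, tensor shapes, connection-type vocabulary, and single-head formulation are illustrative assumptions, not CBLiP's exact architecture.

```python
# Sketch of connection-biased attention, assuming a Graphormer-style additive
# bias on the attention logits. Names and shapes are assumptions for
# illustration, not the paper's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConnectionBiasedAttention(nn.Module):
    def __init__(self, dim: int, num_connection_types: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # One learned scalar bias per connection type (e.g., "linked by a
        # relation", "shares a neighbor", "no connection") -- hypothetical set.
        self.connection_bias = nn.Embedding(num_connection_types, 1)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, connections: torch.LongTensor) -> torch.Tensor:
        # x: (num_nodes, dim) token embeddings for one subgraph
        # connections: (num_nodes, num_nodes) integer connection-type ids
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        logits = (q @ k.transpose(-2, -1)) * self.scale
        # Add the connection bias before the softmax, so attention is steered
        # by how two tokens are linked in the subgraph, with no path encoding.
        logits = logits + self.connection_bias(connections).squeeze(-1)
        return F.softmax(logits, dim=-1) @ v
```

Because the bias enters before the softmax, two entities that are directly linked or share a neighbor can attend to each other more strongly without any explicit path enumeration, which matches the paper's motivation for dropping the path encoding module.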
The paper argues that connection-biased attention and entity role embeddings implicitly capture information about paths, distances, and shared neighborhoods, which is instrumental for correctly predicting the target relation, without the need for explicit path encoding.
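The entity role construct can likewise be sketched as a small learned embedding table indexed by an entity's role in the subgraph rather than by its distance to the head or tail. The three roles and their names below are hypothetical; the point is that unseen entities receive meaningful token features from their role alone.

```python
# Sketch of entity role embeddings, assuming three roles (head, tail, other
# context entity) whose learned embeddings replace distance-based node labels.
# Role names and count are illustrative assumptions.
import torch
import torch.nn as nn

HEAD, TAIL, CONTEXT = 0, 1, 2  # hypothetical role ids

class EntityRoleEmbedding(nn.Module):
    def __init__(self, dim: int, num_roles: int = 3):
        super().__init__()
        self.roles = nn.Embedding(num_roles, dim)

    def forward(self, node_feats: torch.Tensor, head_idx: int, tail_idx: int) -> torch.Tensor:
        # node_feats: (num_nodes, dim) features of the (possibly unseen) entities
        role_ids = torch.full((node_feats.size(0),), CONTEXT,
                              dtype=torch.long, device=node_feats.device)
        role_ids[head_idx], role_ids[tail_idx] = HEAD, TAIL
        # Adding a role embedding tells the Transformer which token plays the
        # head/tail part, independent of the entity's identity.
        return node_feats + self.roles(role_ids)
```

A design motivation consistent with the paper's claim: because roles depend only on an entity's position in the query, this avoids the per-subgraph distance computation that relative-distance labeling schemes require.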