Inductive setting of the knowledge graph completion task. This requires a model to perform link prediction on an entirely new test graph with a new set of entities.
3260 papers • 126 benchmarks • 313 datasets
(Image credit: Papersgraph)
GraIL, a graph neural network based relation prediction framework that reasons over local subgraph structures and has a strong inductive bias to learn entity-independent relational semantics, is proposed.
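The core preprocessing step GraIL relies on, extracting the enclosing subgraph around a candidate (head, tail) pair and labeling each node by its distances to both endpoints, can be sketched in plain Python. This is a minimal illustration, not GraIL's actual implementation; the function names and the undirected-BFS treatment are assumptions for the sketch.

```python
from collections import defaultdict, deque

def bfs_distances(adj, source, max_hops):
    """Hop distances from `source` up to `max_hops`, via BFS."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if dist[node] == max_hops:
            continue
        for nbr in adj[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def enclosing_subgraph(triples, head, tail, k=2):
    """Keep nodes within k hops of BOTH head and tail, labeled by the
    pair (distance-to-head, distance-to-tail) so the GNN never needs
    entity identities -- only structural roles."""
    adj = defaultdict(set)
    for h, _, t in triples:
        adj[h].add(t)
        adj[t].add(h)  # treat edges as undirected for extraction
    d_head = bfs_distances(adj, head, k)
    d_tail = bfs_distances(adj, tail, k)
    nodes = set(d_head) & set(d_tail)
    labels = {v: (d_head[v], d_tail[v]) for v in nodes}
    edges = [(h, r, t) for h, r, t in triples if h in nodes and t in nodes]
    return labels, edges
```

Because the node labels are purely structural, the same trained model can score triples on a test graph whose entities were never seen during training.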
A Communicative Message Passing neural network for Inductive reLation rEasoning, CoMPILE, is proposed that reasons over local directed subgraph structures, has a strong inductive bias to model entity-independent semantic relations, and can naturally handle asymmetric/anti-symmetric relations.
This work proposes an all-in-one solution, BERTRL (BERT-based Relational Learning), which leverages a pre-trained language model and fine-tunes it by taking relation instances and their possible reasoning paths as training samples.
The Neural Bellman-Ford Network (NBFNet) is proposed, a general graph neural network framework that solves the path formulation with learned operators in the generalized Bellman-Ford algorithm and outperforms existing methods by a large margin in both transductive and inductive settings.
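The generalized Bellman-Ford iteration that NBFNet builds on has the form h_v ← ⊕_{(u,r,v)} (h_u ⊗ w_r) with a boundary condition at the source. The sketch below shows one concrete instance, ⊕ = min and ⊗ = +, which recovers single-source shortest paths; NBFNet replaces both operators (and the relation weights w_r) with learned ones. The function signature here is illustrative, not NBFNet's API.

```python
import math

def generalized_bellman_ford(edges, num_nodes, source, rel_weight,
                             combine=min,
                             aggregate=lambda a, b: a + b,
                             num_iters=None):
    """Iterate h_v <- combine over (u, r, v) of aggregate(h_u, w_r),
    with boundary condition h_source = 0.  With combine=min and
    aggregate=+ this is classical Bellman-Ford shortest paths."""
    h = [math.inf] * num_nodes
    h[source] = 0.0
    num_iters = num_iters or num_nodes - 1
    for _ in range(num_iters):
        new_h = list(h)  # synchronous update from the previous layer
        for u, r, v in edges:
            new_h[v] = combine(new_h[v], aggregate(h[u], rel_weight[r]))
        new_h[source] = combine(new_h[source], 0.0)  # keep the boundary
        h = new_h
    return h
```

Each iteration corresponds to one GNN layer, so path information up to length T is captured after T iterations.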
This paper considers rules as cycles, shows that the space of cycles has a unique structure grounded in the mathematics of algebraic topology, and builds a novel GNN framework on the collected cycles to learn cycle representations and predict the existence or non-existence of a relation.
A model, MorsE, is proposed that does not learn embeddings for entities but instead learns transferable meta-knowledge used to produce entity embeddings; it significantly outperforms corresponding baselines on in-KG and out-of-KG tasks in inductive settings.
The concepts of relation path coverage and relation path confidence are introduced to filter out unreliable paths prior to model training, improving model performance.
A novel method, REPORT, is proposed that captures both the connections between entities and the intrinsic nature of entities by simultaneously aggregating RElational Paths and cOntext with a unified hieRarchical Transformer framework.
An n-ary subgraph reasoning framework for fully inductive link prediction (ILP) on n-ary relational facts is proposed and a novel graph structure, the n-ary semantic hypergraph, is introduced to facilitate subgraph extraction.
This work proposes the Anchoring Path Sentence Transformer (APST), comprising a search-based description retrieval method to enrich entity descriptions and an assessment mechanism to evaluate the rationality of anchoring paths (APs), enabling comprehensive predictions and high-quality explanations.