3260 papers • 126 benchmarks • 313 datasets
These leaderboards are used to track progress in Knowledge Graph Completion.
Use these libraries to find Knowledge Graph Completion models and implementations.
GraIL, a graph neural network based relation prediction framework, is proposed; it reasons over local subgraph structures and has a strong inductive bias toward learning entity-independent relational semantics.
This work proposes a holistic evaluation protocol for entity representations learned via a link prediction objective, and evaluates an architecture based on a pretrained language model which exhibits strong generalization to entities not observed during training and outperforms related state-of-the-art methods.
DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs, is proposed; it addresses the problem of learning probabilistic logical rules for inductive and interpretable link prediction.
A unified model for Knowledge Embedding and Pre-trained Language Representation (KEPLER) is proposed, which can not only better integrate factual knowledge into PLMs but also produce effective text-enhanced KE using strong PLMs.
This work presents GPFL, a probabilistic rule learner optimized to mine instantiated first-order logic rules from KGs; it discovers many more high-quality instantiated rules than existing works and improves the predictive performance of learned rules by removing overfitting rules via validation.
This work builds a Rule Hierarchy Framework (RHF), which leverages a collection of subsumption frameworks to build a proper rule hierarchy from a set of learned rules, and adapts RHF to an existing rule learner, designing and implementing two Hierarchical Pruning Methods (HPMs) that utilize the generated hierarchies to remove irrelevant and redundant rules.
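Several of the entries above (DRUM, GPFL, RHF) mine first-order rules of the form head(x, y) ← body1(x, z) ∧ body2(z, y). A minimal sketch of applying one such length-2 chain rule to a toy knowledge graph — the entity and relation names below are illustrative, not drawn from any of the papers:

```python
# Toy KG as a set of (head_entity, relation, tail_entity) facts.
facts = {
    ("alice", "born_in", "paris"),
    ("paris", "city_of", "france"),
    ("bob", "born_in", "berlin"),
    ("berlin", "city_of", "germany"),
}

def apply_chain_rule(facts, r1, r2, head_rel):
    """Infer head_rel(x, y) from r1(x, z) and r2(z, y) (a length-2 chain rule)."""
    inferred = set()
    for (x, ra, z) in facts:
        if ra != r1:
            continue
        for (z2, rb, y) in facts:
            if rb == r2 and z2 == z:
                inferred.add((x, head_rel, y))
    return inferred

# Rule: nationality(x, y) <- born_in(x, z) AND city_of(z, y)
new_facts = apply_chain_rule(facts, "born_in", "city_of", "nationality")
print(sorted(new_facts))
# prints [('alice', 'nationality', 'france'), ('bob', 'nationality', 'germany')]
```

Rule learners differ in how they score and prune candidate rules (e.g., by confidence over the training graph), but rule application for link prediction follows this same join pattern.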
This study proposes RMPI, a new method that uses a novel Relational Message Passing network for fully inductive KGC; it passes messages directly between relations to make full use of relation patterns for subgraph reasoning, with new techniques for graph transformation, graph pruning, relation-aware neighborhood attention, and handling empty subgraphs.
InGram, an INductive knowledge GRAph eMbedding method, is proposed; it can generate embeddings of new relations as well as new entities at inference time, and outperforms 14 different state-of-the-art methods across varied inductive learning scenarios.
A number of variants of a rule-based approach, specifically aimed at addressing the issues of underperformance and interpretability, are studied; the resulting models achieve performance close to that of NBFNet.
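The papers above all address variants of the same underlying task: given a knowledge graph of (head, relation, tail) triples, predict missing links. A minimal sketch using a TransE-style translational scorer (h + r ≈ t); TransE is a standard baseline rather than one of the methods listed, and the entity names and constructed embeddings are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy entity embeddings; a real model learns these from training triples.
entities = {n: rng.normal(size=dim) for n in ["Paris", "France", "Berlin", "Germany"]}
# The relation vector is normally learned too; here it is constructed so
# that the fact (Paris, capital_of, France) holds exactly.
relations = {"capital_of": entities["France"] - entities["Paris"]}

def score(head: str, rel: str, tail: str) -> float:
    """TransE score: distance between translated head and tail (lower is better)."""
    return float(np.linalg.norm(entities[head] + relations[rel] - entities[tail]))

def predict_tail(head: str, rel: str) -> str:
    """Link prediction: rank every entity as a candidate tail, return the best."""
    return min(entities, key=lambda e: score(head, rel, e))

print(predict_tail("Paris", "capital_of"))  # prints "France"
```

The leaderboard metrics tracked on pages like this one (MRR, Hits@k) come from exactly this ranking step, computed over all candidate tails (and heads) for each held-out triple.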