3260 papers • 126 benchmarks • 313 datasets
A novel knowledge graph embedding model, Hierarchy-Aware Knowledge Graph Embedding (HAKE), which maps entities into the polar coordinate system and significantly outperforms existing state-of-the-art methods on benchmark datasets for the link prediction task.
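HAKE's polar-coordinate idea can be summarized in a short sketch: a modulus part models the hierarchy level and a phase part separates entities at the same level. The function below is an illustrative rendering of that distance (the weight `lam` and argument names are assumptions, not the paper's exact configuration):

```python
import numpy as np

def hake_score(h_mod, h_phase, r_mod, r_phase, t_mod, t_phase, lam=0.5):
    """Hedged sketch of a HAKE-style score: entities are represented in
    polar coordinates, so each embedding splits into a modulus part
    (hierarchy level) and a phase part (position within a level).
    All arguments are 1-D numpy arrays; `lam` weights the phase term."""
    # Modulus part: relation scales the head's modulus toward the tail's.
    modulus_part = np.linalg.norm(h_mod * r_mod - t_mod, ord=2)
    # Phase part: relation rotates the head's phase toward the tail's.
    phase_part = np.linalg.norm(np.sin((h_phase + r_phase - t_phase) / 2.0), ord=1)
    return -(modulus_part + lam * phase_part)  # higher score = more plausible
```

A triple whose moduli match exactly and whose phases cancel gets the maximal score of zero; any mismatch in level or phase lowers the score.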
ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
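ConvE's scoring pipeline — reshape head and relation embeddings to 2-D, stack and convolve them, then match the flattened feature map against the tail — can be sketched roughly as follows. The shapes, the single filter, and the projection matrix are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def conve_score(h, r, t, filt, emb_shape=(2, 4)):
    """Rough sketch of ConvE-style scoring: head and relation embeddings
    are reshaped to 2-D, stacked, convolved with a filter, and the
    flattened feature map is projected and dotted with the tail embedding.
    `filt`, `emb_shape`, and the projection below are hypothetical."""
    stacked = np.concatenate([h.reshape(emb_shape), r.reshape(emb_shape)], axis=0)
    fh, fw = filt.shape
    H, W = stacked.shape
    # Valid 2-D cross-correlation over the stacked input.
    fmap = np.array([[np.sum(stacked[i:i + fh, j:j + fw] * filt)
                      for j in range(W - fw + 1)]
                     for i in range(H - fh + 1)])
    feat = np.maximum(fmap.ravel(), 0.0)  # ReLU nonlinearity
    # Hypothetical linear projection back to embedding size.
    W_proj = np.ones((feat.size, t.size)) / feat.size
    return float(feat @ W_proj @ t)
```

In the actual model the filter bank, projection matrix, and embeddings are all learned jointly; this sketch only traces the data flow.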
Inspired by generative adversarial networks (GANs), this framework uses one knowledge graph embedding model as a negative sample generator to assist the training of the desired model, which acts as the discriminator in GANs.
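The generator's role in this GAN-style setup is to propose hard negatives: corruptions that the generator model itself scores as plausible are sampled more often, giving the discriminator more informative training signal. A minimal sketch of that sampling step, with all names and the temperature parameter assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_negative(candidate_tails, generator_scores, temperature=1.0):
    """Hedged sketch of GAN-style negative sampling for KG embeddings:
    the generator embedding model scores candidate tail corruptions, and
    higher-scoring (harder) negatives are drawn with higher probability
    to train the discriminator model. All names here are illustrative."""
    logits = np.asarray(generator_scores, dtype=float) / temperature
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    return rng.choice(candidate_tails, p=probs)
```

Uniform negative sampling is the degenerate case where all generator scores are equal.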
This work presents an end-to-end approach that takes unstructured textual input and generates structured output compliant with a given vocabulary and proposes an encoder-decoder framework with an attention mechanism that leverages knowledge graph embeddings.
This work reduces the impact of false negative supervision by adopting a pretrained one-hop embedding model to estimate the reward of unobserved facts, and counters on-policy RL's sensitivity to spurious paths by forcing the agent to explore a diverse set of paths using randomly generated edge masks.
This work introduces a class of hyperbolic KG embedding models that simultaneously capture hierarchical and logical patterns in KGs and observes that different geometric transformations capture different types of relations while attention-based transformations generalize to multiple relations.
It is shown that using knowledge graph embeddings can increase the accuracy of effect prediction with neural networks, and a fine-tuning architecture is implemented that adapts the knowledge graph embeddings to the effect prediction task and leads to better performance.
TransR is proposed to build entity and relation embeddings in distinct entity and relation spaces, first projecting entities from entity space into the corresponding relation space and then building translations between the projected entities.
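TransR's projection-then-translation step reduces to a few lines: each relation carries its own projection matrix that maps entities from entity space into that relation's space, where the usual translation constraint is applied. A minimal sketch (argument names assumed):

```python
import numpy as np

def transr_score(h, r, t, M_r):
    """Sketch of TransR-style scoring: entities live in entity space,
    each relation has its own space plus a projection matrix M_r, and
    the projected entities should satisfy the translation h_r + r ≈ t_r."""
    h_r = M_r @ h  # project head into the relation-specific space
    t_r = M_r @ t  # project tail into the relation-specific space
    return -float(np.linalg.norm(h_r + r - t_r))  # negative distance
```

With `M_r` fixed to the identity this collapses to TransE, which is why TransR is often described as its generalization to relation-specific spaces.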
MTransE, a translation-based model for multilingual knowledge graph embeddings, is proposed to provide a simple and automated solution for cross-lingual knowledge alignment, and the work explores how MTransE preserves the key properties of its monolingual counterpart.
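One way MTransE relates embeddings across languages is through a learned linear transformation between the per-language embedding spaces, so that aligned entities land close together after mapping. The sketch below shows only that alignment distance, with all names illustrative (MTransE also considers other alignment variants):

```python
import numpy as np

def alignment_distance(e_i, e_j, M_ij):
    """Hedged sketch of a linear-transformation alignment term in the
    spirit of MTransE: M_ij maps an entity embedding from language i's
    space into language j's space, and known cross-lingual entity pairs
    should have a small distance after the mapping."""
    return float(np.linalg.norm(M_ij @ e_i - e_j))
```

During training, this distance is minimized over known aligned pairs jointly with each language's own translation-based triple loss.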