3260 papers • 126 benchmarks • 313 datasets
(Image credit: Papersgraph)
These leaderboards are used to track progress in graph representation learning.
Use these libraries to find graph representation learning models and implementations.
Experimental results show that the proposed RotatE model is not only scalable but also able to infer and model various relation patterns, and that it significantly outperforms existing state-of-the-art models for link prediction.
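A minimal sketch of the RotatE idea: each relation acts as an element-wise rotation in complex embedding space, so plausible triples have a small distance between the rotated head and the tail. The embedding dimension, L1 distance, and random data below are illustrative assumptions, not the paper's exact training setup.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE-style distance: the relation is an element-wise unit-modulus
    rotation in complex space (lower distance = more plausible triple)."""
    r = np.exp(1j * r_phase)          # unit-modulus complex rotation
    return np.sum(np.abs(h * r - t))  # L1 distance over complex moduli

rng = np.random.default_rng(0)
dim = 8
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
phase = rng.uniform(0, 2 * np.pi, size=dim)
t_true = h * np.exp(1j * phase)       # tail that exactly satisfies t = h ∘ r
t_rand = rng.normal(size=dim) + 1j * rng.normal(size=dim)

# The matching tail scores (near) zero; a random tail scores higher.
assert rotate_score(h, phase, t_true) < rotate_score(h, phase, t_rand)
```

Because the rotation preserves modulus, this parameterization can represent symmetric, antisymmetric, inverse, and compositional relation patterns, which is the property the summary refers to.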
A graph neural network based relation prediction framework, GraIL, that reasons over local subgraph structures and has a strong inductive bias to learn entity-independent relational semantics is proposed.
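The local-subgraph idea behind GraIL can be illustrated with a plain BFS: take the intersection of the k-hop neighborhoods around the two target entities and reason over that enclosing subgraph. The adjacency-dict representation and toy graph below are hypothetical; a real implementation works over directed, relation-typed edges.

```python
from collections import deque

def k_hop_neighbors(adj, start, k):
    """Undirected k-hop neighborhood via BFS over an adjacency dict."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen

def enclosing_subgraph(adj, head, tail, k=2):
    """Nodes in the intersection of the k-hop neighborhoods of head and
    tail -- the local structure GraIL-style models reason over."""
    return k_hop_neighbors(adj, head, k) & k_hop_neighbors(adj, tail, k)

# Toy graph (hypothetical): two paths from "a" to "d".
adj = {
    "a": ["b", "c"], "b": ["a", "d"], "c": ["a", "d"],
    "d": ["b", "c", "e"], "e": ["d"],
}
print(sorted(enclosing_subgraph(adj, "a", "d", k=1)))  # → ['b', 'c']
```

Because the extracted subgraph contains no entity identities, only structure, a model scoring it learns entity-independent relational semantics and can generalize to unseen entities, which is what makes the approach inductive.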
A novel knowledge graph embedding model, Hierarchy-Aware Knowledge Graph Embedding (HAKE), which maps entities into the polar coordinate system and significantly outperforms existing state-of-the-art methods on benchmark datasets for the link prediction task.
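A hedged sketch of the polar-coordinate scoring HAKE describes: the modulus part separates entities at different hierarchy levels, while the phase part distinguishes entities at the same level. The mixing weight and the exact distance norms below are illustrative assumptions, not the paper's tuned formulation.

```python
import numpy as np

def hake_distance(h_mod, h_phase, r_mod, r_phase, t_mod, t_phase, lam=0.5):
    """Polar-coordinate distance in the spirit of HAKE: radial (modulus)
    term for hierarchy depth, angular (phase) term for same-level
    entities. `lam` is an illustrative mixing weight."""
    modulus_part = np.sum(np.abs(h_mod * r_mod - t_mod))
    phase_part = np.sum(np.abs(np.sin((h_phase + r_phase - t_phase) / 2)))
    return modulus_part + lam * phase_part

# A triple that exactly satisfies both parts has zero distance.
h_mod, h_phase = np.array([1.0, 2.0]), np.array([0.3, 1.0])
r_mod, r_phase = np.array([0.5, 2.0]), np.array([0.2, 0.4])
t_mod, t_phase = h_mod * r_mod, h_phase + r_phase
print(hake_distance(h_mod, h_phase, r_mod, r_phase, t_mod, t_phase))  # → 0.0
```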
Extensive experiments show that directly keeping track of large-score negative triplets with a cache yields significant improvements across various KG embedding models and outperforms state-of-the-art GAN-based negative sampling methods.
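The cache-based negative sampling described above can be sketched as keeping the highest-scoring negatives seen so far and periodically refreshing the cache from freshly sampled candidates. The scoring function here is a stand-in for a real KG embedding model's score; the cache size and candidate counts are toy values.

```python
import heapq
import random

def refresh_cache(cache, candidates, score, cache_size):
    """Keep the cache_size highest-scoring negatives seen so far.
    `score` is any model scoring function (higher = harder negative)."""
    return heapq.nlargest(cache_size, cache + candidates, key=score)

random.seed(0)
score = lambda neg: neg  # stand-in: pretend the score is the value itself
cache = []
for _ in range(5):                # five "training epochs" of refreshes
    candidates = [random.random() for _ in range(10)]
    cache = refresh_cache(cache, candidates, score, cache_size=3)
# cache now holds the 3 hardest (highest-scoring) negatives observed
```

Sampling training negatives from such a cache concentrates on hard examples without training a separate generator network, which is the efficiency argument against GAN-based sampling.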
This paper explores the suitability of using a knowledge graph embedding approach for ecotoxicological effect prediction, and results show that the knowledge graph based approach improves the selected baselines.
This paper proposes CompGCN, a novel Graph Convolutional framework which jointly embeds both nodes and relations in a relational graph and leverages a variety of entity-relation composition operations from Knowledge Graph Embedding techniques and scales with the number of relations.
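The entity-relation composition operations CompGCN leverages come from standard KG embedding techniques; a minimal sketch of three common choices follows. The operator names are conventional labels, not CompGCN's exact API.

```python
import numpy as np

def compose(e, r, op="sub"):
    """Entity-relation composition operators used in CompGCN-style
    message passing, composing a neighbor embedding e with its
    relation embedding r before aggregation."""
    if op == "sub":      # subtraction, as in TransE
        return e - r
    if op == "mult":     # element-wise product, as in DistMult
        return e * r
    if op == "corr":     # circular correlation, as in HolE
        return np.real(np.fft.ifft(np.conj(np.fft.fft(e)) * np.fft.fft(r)))
    raise ValueError(f"unknown composition op: {op}")

e = np.array([1.0, 2.0, 3.0])
r = np.array([0.5, 0.5, 0.5])
print(compose(e, r, "sub"))   # → [0.5 1.5 2.5]
print(compose(e, r, "mult"))  # → [0.5 1.  1.5]
```

Because each relation is itself an embedding fed through `compose`, the parameter count grows with the number of relations' embedding table rather than with per-relation weight matrices, which is how the framework scales with the number of relations.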
This work formulates Interstellar as a recurrent neural architecture search problem over the short-term and long-term information along relational paths, and proposes to search for a recurrent architecture, the Interstellar, tailored to different KG tasks.
Inspired by generative adversarial networks (GANs), this framework uses one knowledge graph embedding model as a negative sample generator to assist the training of the desired model, which acts as the discriminator in GANs.
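The generator's role in this GAN-style setup can be sketched as sampling a negative candidate with probability proportional to the softmax of the generator's scores, so the discriminator (the desired embedding model) is trained on hard negatives. The candidate scores below are toy values, not outputs of a real generator.

```python
import numpy as np

def sample_negative(gen_scores, rng):
    """Generator step in a GAN-style negative sampler: draw one negative
    index with probability softmax(generator scores), so higher-scoring
    (harder) negatives are chosen more often."""
    z = gen_scores - gen_scores.max()        # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    return rng.choice(len(gen_scores), p=probs)

rng = np.random.default_rng(0)
scores = np.array([0.1, 3.0, 0.2, 0.1])      # toy candidate negative scores
picks = [sample_negative(scores, rng) for _ in range(200)]
# index 1 (the highest-scoring candidate) dominates the samples
```

In the full adversarial scheme, the discriminator's loss on these sampled negatives is fed back as a reward signal to update the generator, analogous to the GAN training loop.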
The Multi-partition Embedding Interaction iMproved beyond block term format (MEIM) model is introduced, with an independent core tensor for ensemble effects and soft orthogonality for max-rank mapping, in addition to multi-partition embedding.