3260 papers • 126 benchmarks • 313 datasets
Relationship extraction is the task of extracting semantic relationships from a text. Extracted relationships usually occur between two or more entities of a certain type (e.g. Person, Organisation, Location) and fall into a number of semantic categories (e.g. married to, employed by, lives in).
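As a concrete illustration of the task's input and output, here is a minimal sketch using a toy rule-based matcher (the sentences, patterns, and relation names are invented for illustration; real systems use entity typing and learned classifiers):

```python
import re

# Toy pattern-based extractor: maps a sentence to (head, relation, tail)
# triples. Patterns and relation labels are illustrative only.
PATTERNS = [
    (re.compile(r"(\w[\w ]*) is married to (\w[\w ]*)"), "married_to"),
    (re.compile(r"(\w[\w ]*) works at (\w[\w ]*)"), "employed_by"),
    (re.compile(r"(\w[\w ]*) lives in (\w[\w ]*)"), "lives_in"),
]

def extract_relations(sentence):
    """Return all relation triples matched in the sentence."""
    triples = []
    for pattern, relation in PATTERNS:
        for head, tail in pattern.findall(sentence):
            triples.append((head.strip(), relation, tail.strip()))
    return triples

print(extract_relations("Alice works at Acme Corp"))
# [('Alice', 'employed_by', 'Acme Corp')]
```

The output triple pairs two typed entities (Person, Organisation) with a semantic category, matching the task definition above.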
(Image credit: Papersgraph)
These leaderboards are used to track progress in distantly-supervised relation extraction.
Use these libraries to find distantly-supervised relation extraction models and implementations.
No subtasks available.
Two novel word attention models for distantly-supervised relation extraction are proposed: a Bi-directional Gated Recurrent Unit (Bi-GRU) based word attention model, and a combination model that merges multiple complementary models via a weighted voting method for improved relation extraction.
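The weighted-voting combination described above can be sketched as follows; this is a generic ensemble sketch with made-up weights and scores, not the paper's implementation:

```python
import numpy as np

def weighted_vote(model_scores, weights):
    """Combine per-model relation-probability vectors by weighted voting.

    model_scores: list of arrays, one per model, each of shape (n_relations,).
    weights: per-model weights, e.g. derived from validation accuracy.
    Returns the index of the winning relation label.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalise so weights sum to 1
    combined = sum(w * s for w, s in zip(weights, model_scores))
    return int(np.argmax(combined))

# Illustrative scores from three hypothetical models over 4 relation labels
scores = [np.array([0.1, 0.7, 0.1, 0.1]),
          np.array([0.2, 0.5, 0.2, 0.1]),
          np.array([0.6, 0.2, 0.1, 0.1])]
print(weighted_vote(scores, weights=[0.4, 0.4, 0.2]))  # 1
```

Weighting lets stronger models dominate ties while weaker but complementary models still contribute.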
This paper proposes a novel model, dubbed Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning, to address the wrong-label problem that arises when using distant supervision for relation extraction; it adopts a convolutional architecture with piecewise max pooling to automatically learn relevant features.
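Piecewise max pooling splits each convolutional feature map into three segments at the two entity positions and max-pools each segment separately. A minimal NumPy sketch of that pooling step, assuming a precomputed feature map (values and boundaries are illustrative):

```python
import numpy as np

def piecewise_max_pool(feature_map, head_pos, tail_pos):
    """Max-pool a (seq_len, n_filters) feature map in three pieces:
    up to the head entity, between the entities, and after the tail.
    Returns a (3 * n_filters,) vector, as in PCNN-style pooling."""
    seq_len, n_filters = feature_map.shape
    segments = [feature_map[: head_pos + 1],
                feature_map[head_pos + 1 : tail_pos + 1],
                feature_map[tail_pos + 1 :]]
    pooled = [seg.max(axis=0) if len(seg) else np.zeros(n_filters)
              for seg in segments]
    return np.concatenate(pooled)

fm = np.arange(12, dtype=float).reshape(6, 2)  # toy 6-token, 2-filter map
print(piecewise_max_pool(fm, head_pos=1, tail_pos=3))
# [ 2.  3.  6.  7. 10. 11.]
```

Pooling each segment separately preserves coarse positional structure relative to the two entities, which plain max pooling over the whole sentence would discard.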
BEX, a new bootstrapping method that protects against semantic drift through highly effective confidence assessment, is introduced; it uses entity and template seeds jointly (as opposed to just one, as in previous work), expands entities and templates in parallel and in a mutually constraining fashion in each iteration, and introduces higher-quality similarity measures for templates.
RESIDE is a distantly-supervised neural relation extraction method which utilizes additional side information from KBs for improved relation extraction and employs Graph Convolution Networks to encode syntactic information from text and improves performance even when limited side information is available.
A novel method that automatically identifies relations in a sentence (sentential relation extraction) and aligns them to a knowledge graph (KG); it significantly outperforms all state-of-the-art methods on the NYT Freebase and Wikidata datasets.
A new DS paradigm, document-based distant supervision, models relation extraction as a document-based machine reading comprehension (MRC) task and designs a new loss function, DSLoss (distant supervision loss), which can effectively train MRC models using only $\langle$document, question, answer$\rangle$ tuples, so the noisy-label problem is inherently resolved.
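Framing relation extraction as machine reading comprehension means casting each candidate relation as a question over the document. A toy sketch of building such ⟨document, question, answer⟩ tuples (the templates, relation names, and example facts are invented for illustration):

```python
# Hypothetical question templates, one per relation type
TEMPLATES = {
    "employed_by": "Who is {head} employed by?",
    "lives_in": "Where does {head} live?",
}

def build_mrc_example(document, head_entity, relation, answer):
    """Turn a (head, relation, answer) fact into an MRC training tuple.
    An MRC model is then trained to locate `answer` in `document`."""
    question = TEMPLATES[relation].format(head=head_entity)
    return {"document": document, "question": question, "answer": answer}

ex = build_mrc_example("Alice joined Acme Corp in 2019.",
                       "Alice", "employed_by", "Acme Corp")
print(ex["question"])  # Who is Alice employed by?
```

Because the answer must be a span supported by the whole document rather than a single noisily-labelled sentence, supervision of this form sidesteps per-sentence label noise.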
A constraint graph is introduced to model the dependencies between relation labels, and a constraint-aware attention module is designed in CGRE to integrate the constraint information and improve noise immunity.
The KGPool method dynamically expands the context with additional facts from the KG, and learns the representation of these facts (entity alias, entity descriptions, etc.) using neural methods, supplementing the sentential context.