3260 papers • 126 benchmarks • 313 datasets
The goal of Slot Filling is to identify, from a running dialog, different slots that correspond to parameters of the user's query. For instance, when a user queries for nearby restaurants, key slots such as location and preferred food are required for a dialog system to retrieve the appropriate information. The main challenge in slot filling is therefore to extract the target entities.
Source: Real-time On-Demand Crowd-powered Entity Extraction
Image credit: Robust Retrieval Augmented Generation for Zero-shot Slot Filling
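Slot filling is commonly framed as sequence labeling: each token gets a BIO tag, and tagged spans are collected into slot values. The following is a minimal sketch of that decoding step; the tokens, tags, and slot names are illustrative, not from any particular dataset.

```python
# Minimal sketch: collecting BIO-tagged token spans into a slot dict.
# Tags follow the common BIO convention: B-<slot> starts a span,
# I-<slot> continues it, O means "outside any slot".

def decode_slots(tokens, tags):
    """Return a {slot_name: value} dict from parallel token/tag lists."""
    slots = {}
    current_slot, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_slot:  # close any open span
                slots[current_slot] = " ".join(current_tokens)
            current_slot, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_slot == tag[2:]:
            current_tokens.append(token)
        else:
            if current_slot:
                slots[current_slot] = " ".join(current_tokens)
            current_slot, current_tokens = None, []
    if current_slot:  # close a span that runs to the end
        slots[current_slot] = " ".join(current_tokens)
    return slots

# e.g. the restaurant query from the task description
tokens = ["find", "thai", "food", "near", "downtown"]
tags = ["O", "B-food", "O", "O", "B-location"]
print(decode_slots(tokens, tags))  # {'food': 'thai', 'location': 'downtown'}
```

A model's per-token predictions can be fed straight into this decoder; slot-filling F1 is then computed over the recovered spans.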
These leaderboards are used to track progress in slot-filling
Use these libraries to find slot-filling models and implementations
This work proposes a joint intent classification and slot filling model based on BERT that achieves significant improvement on intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to the attention-based recurrent neural network models and slot-gated models.
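The joint architecture described above can be sketched as a shared encoder with two heads: an intent classifier over the first ([CLS]-style) token and a per-token slot tagger. This is a minimal sketch with a toy Transformer encoder standing in for BERT; the vocabulary size, dimensions, and label counts are illustrative assumptions.

```python
# Sketch of joint intent classification and slot filling:
# one shared encoder, two output heads. A small nn.TransformerEncoder
# stands in for BERT here; it is NOT the paper's exact model.
import torch
import torch.nn as nn

class JointIntentSlotModel(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_intents=5, n_slot_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.intent_head = nn.Linear(d_model, n_intents)  # reads the first token
        self.slot_head = nn.Linear(d_model, n_slot_tags)  # reads every token

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))            # (batch, seq, d_model)
        intent_logits = self.intent_head(h[:, 0])          # (batch, n_intents)
        slot_logits = self.slot_head(h)                    # (batch, seq, n_slot_tags)
        return intent_logits, slot_logits

model = JointIntentSlotModel()
intent_logits, slot_logits = model(torch.randint(0, 1000, (2, 7)))
```

Training sums a cross-entropy loss on the intent logits with a per-token cross-entropy loss on the slot logits, which is what lets the two tasks share the encoder.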
This work proposes an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding and dialog systems.
The MASSIVE dataset (Multilingual Amazon SLU resource package (SLURP) for Slot-filling, Intent classification, and Virtual assistant Evaluation) is presented, along with modeling results on XLM-R and mT5, including exact match accuracy, intent classification accuracy, and slot-filling F1 score.
It is shown that an end-to-end dialog system based on Memory Networks can reach promising, yet imperfect, performance and learn to perform non-trivial operations; it is compared to a hand-crafted slot-filling baseline on data from the second Dialog State Tracking Challenge.
This work introduces the Schema-Guided Dialogue (SGD) dataset, containing over 16k multi-domain conversations spanning 16 domains, and presents a schema-guided paradigm for task-oriented dialogue, in which predictions are made over a dynamic set of intents and slots provided as input.
A capsule-based neural network model is proposed which accomplishes slot filling and intent detection via a dynamic routing-by-agreement schema and a re-routing schema is proposed to further synergize the slot filling performance using the inferred intent representation.
This work shows for the first time that it can learn dense representations of phrases alone that achieve much stronger performance in open-domain QA and proposes a query-side fine-tuning strategy, which can support transfer learning and reduce the discrepancy between training and inference.
A paradigm for the programmatic creation of training sets called data programming is proposed in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict.
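The labeling functions described above can be sketched as small programs that either vote for a label or abstain, with their noisy votes aggregated afterwards. This is a minimal sketch using keyword heuristics and a simple majority vote; the labels and keywords are illustrative, and real data-programming systems learn a model to denoise and weight the votes rather than voting uniformly.

```python
# Sketch of data programming: labeling functions that label subsets of
# the data (or abstain), aggregated here by a simple majority vote.
from collections import Counter

ABSTAIN, FOOD, LOCATION = None, "food", "location"

def lf_food_keywords(text):
    # Noisy heuristic: mentions of cuisines suggest a food query.
    return FOOD if any(w in text for w in ("pizza", "thai", "sushi")) else ABSTAIN

def lf_location_keywords(text):
    # Noisy heuristic: spatial words suggest a location query.
    return LOCATION if any(w in text for w in ("near", "downtown", "nearby")) else ABSTAIN

labeling_functions = [lf_food_keywords, lf_location_keywords]

def majority_label(text):
    """Aggregate non-abstaining votes; return None if every LF abstains."""
    votes = [lf(text) for lf in labeling_functions]
    votes = [v for v in votes if v is not ABSTAIN]
    return Counter(votes).most_common(1)[0][0] if votes else ABSTAIN

print(majority_label("sushi place"))  # 'food'
```

Because labeling functions may conflict on the same example, the aggregation step is where the approach earns its keep: it turns many cheap, noisy heuristics into a single training label per example.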
It is found that a shared dense vector index coupled with a seq2seq model is a strong baseline, outperforming more tailor-made approaches for fact checking, open-domain question answering and dialogue, and yielding competitive results on entity linking and slot filling, by generating disambiguated text.
This paper presents a novel, fully data-driven, and knowledge-grounded neural conversation model aimed at producing more contentful responses, generalizing the widely-used Sequence-to-Sequence (Seq2Seq) approach by conditioning responses on both conversation history and external “facts”, allowing the model to be versatile and applicable in an open-domain setting.