3260 papers • 126 benchmarks • 313 datasets
Given information from a knowledge base, generate a description of this information in natural language.
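To make the task concrete, here is a toy illustration (not a model from any listed paper): knowledge-base facts represented as (subject, relation, object) triples, verbalised by a simple hand-written template baseline. All names and templates are hypothetical.

```python
# Hypothetical KB facts as (subject, relation, object) triples.
triples = [
    ("Alan Turing", "birthPlace", "London"),
    ("Alan Turing", "field", "computer science"),
]

def realise(triples):
    """Join per-relation templates into one sentence (toy baseline,
    not a learned table-to-text model)."""
    templates = {
        "birthPlace": "{s} was born in {o}",
        "field": "{s} worked in {o}",
    }
    clauses = [templates[r].format(s=s, o=o) for s, r, o in triples]
    return "; ".join(clauses) + "."

print(realise(triples))
# → Alan Turing was born in London; Alan Turing worked in computer science.
```

Neural table-to-text systems replace the fixed templates with a learned generator, but the input/output contract is the same: structured facts in, fluent text out.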
(Image credit: Papersgraph)
These leaderboards are used to track progress in table-to-text generation.
Use these libraries to find table-to-text generation models and implementations.
This work builds a generation framework based on a pointer network that can copy facts from the input KB, and adds two attention mechanisms: (i) slot-aware attention, to capture the association between a slot type and its corresponding slot value; and (ii) a new table-position self-attention, to capture the inter-dependencies among related slots.
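The attention mechanisms above are variants of standard scaled dot-product attention. A minimal sketch, assuming slot-type vectors attend over embedded slot values; all dimensions and variable names here are illustrative, not taken from the paper:

```python
import numpy as np

def attention(queries, keys, values):
    """Scaled dot-product attention.
    queries: (q, d); keys, values: (k, d). Returns a (q, d) context."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (q, k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ values

rng = np.random.default_rng(0)
slot_types = rng.standard_normal((3, 8))   # e.g. name, birthplace, occupation
slot_values = rng.standard_normal((5, 8))  # embedded cell values
context = attention(slot_types, slot_values, slot_values)
print(context.shape)  # (3, 8)
```

Slot-aware attention would use slot-type embeddings as queries over their slot values, while table-position self-attention would apply the same operation with the slots attending over one another.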
It is suggested that the PLMs benefit from similar facts seen during pretraining or fine-tuning, such that they perform well even when the input graph is reduced to a simple bag of node and edge labels.
This paper proposes a structured graph-to-text model with a two-step fine-tuning mechanism that first fine-tunes the model on Wikipedia before adapting it to graph-to-text generation, and proposes a novel tree-level embedding method to capture the inter-dependency structures of the input graph.