Abstract
Large pre-trained neural models have recently shown remarkable progress in text generation. In this paper, we propose to generate text conditioned on structured data (a table) and a prefix (the already-written text) by leveraging pre-trained models. We present a new data-to-text dataset, Table with Written Text (TWT), built by repurposing two existing datasets: ToTTo and TabFact. TWT contains both factual and logical statements that are faithful to the structured data, and aims to serve as a useful benchmark for controlled text generation. Compared with existing data-to-text task settings, TWT is more intuitive: the prefix (usually provided by the user) controls the topic of the generated text. On TWT, existing methods usually output hallucinated text that is not faithful to the table. We therefore design a novel approach with table-aware attention visibility and a copy mechanism over the table. Experimental results show that our approach outperforms state-of-the-art methods under both automatic and human evaluation metrics.
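The abstract names the two ingredients of the approach but gives no implementation detail. Below is a minimal, hypothetical sketch in PyTorch of one plausible reading: restrict attention visibility so that table tokens attend only within the table, and mix the decoder's vocabulary distribution with a pointer-style copy distribution over table positions. The function names `table_aware_mask` and `copy_over_table` and the copy gate `p_copy` are illustrative assumptions, not the paper's actual API.

```python
# Illustrative sketch only -- not the paper's code. Shows one plausible
# form of "table-aware attention visibility" and a "copy mechanism over
# the table" (pointer-generator style).
import torch
import torch.nn.functional as F

def table_aware_mask(is_table: torch.Tensor) -> torch.Tensor:
    """Build a (seq, seq) boolean visibility mask.

    is_table: bool tensor of shape (seq,); True where the position belongs
    to the linearized table, False for prefix/generated-text tokens.
    Assumption: table tokens see only other table tokens (keeping the table
    representation independent of the text), while text tokens see everything.
    """
    seq = is_table.size(0)
    visible = torch.ones(seq, seq, dtype=torch.bool)
    # For each table-token row of the mask, allow attention only to table columns.
    visible[is_table] = is_table.unsqueeze(0)
    return visible

def copy_over_table(gen_logits, attn_weights, src_ids, is_table, p_copy):
    """Mix the generator's vocab distribution with a copy distribution
    that points only at table positions.

    gen_logits:   (vocab,) logits over the output vocabulary
    attn_weights: (seq,)   decoder-to-source attention weights
    src_ids:      (seq,)   vocabulary ids of the source tokens
    p_copy:       scalar in [0, 1], the (learned) copy gate
    """
    vocab_dist = F.softmax(gen_logits, dim=-1)
    # Zero out attention to non-table positions, then renormalize.
    table_attn = attn_weights * is_table.float()
    table_attn = table_attn / table_attn.sum().clamp_min(1e-9)
    # Scatter the table attention mass onto the vocabulary ids it points at.
    copy_dist = torch.zeros_like(vocab_dist).scatter_add(0, src_ids, table_attn)
    return (1 - p_copy) * vocab_dist + p_copy * copy_dist

# Tiny usage example with made-up shapes: 3 table tokens, 2 text tokens,
# a 50-word vocabulary, and a fixed copy gate of 0.3.
is_table = torch.tensor([True, True, True, False, False])
mask = table_aware_mask(is_table)
out = copy_over_table(torch.randn(50), torch.rand(5),
                      torch.randint(0, 50, (5,)), is_table, torch.tensor(0.3))
```

Under this reading, the mask biases the encoder toward a table representation that cannot be contaminated by the prefix, while the copy gate lets the decoder reproduce cell values verbatim, both of which plausibly reduce hallucination.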
Li, T., Fang, L., Lou, J. G., & Li, Z. (2021). TWT: Table with Written Text for Controlled Data-to-Text Generation. In Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 (pp. 1244–1254). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.107