Structure-Aware Pre-Training for Table-to-Text Generation

Abstract

Table-to-text generation is a subtask of data-to-text generation that aims to generate natural language text from an input table. Pre-training techniques have achieved great success on table-to-text generation. However, the pre-trained models used in previous work are typically trained on free-form natural language text, whereas the input of the table-to-text task is a structured table. In this paper, we propose STTP, a pre-trained model that is trained on tables and their contexts. STTP can understand the structured input table and generate fluent text. Experiments on two datasets show the efficacy of our model.
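The mismatch described above is typically bridged by linearizing the structured table into a token sequence before feeding it to a sequence-to-sequence model. The sketch below illustrates one common linearization scheme (cell values paired with their column headers); the abstract does not specify STTP's actual input format, so the template here is an illustrative assumption, not the paper's method.

```python
def linearize_table(header, rows):
    """Flatten a table into a single text sequence for a seq2seq model.

    Each cell is rendered as "<column> is <value>" so the model can
    recover the table structure from the token sequence alone.
    (Illustrative format only; STTP's real scheme may differ.)
    """
    cells = []
    for row in rows:
        for col, val in zip(header, row):
            cells.append(f"{col} is {val}")
    return " ; ".join(cells)


# Example: a one-row table about a basketball player.
header = ["Name", "Team", "Points"]
rows = [["Smith", "Lakers", "32"]]
print(linearize_table(header, rows))
# Name is Smith ; Team is Lakers ; Points is 32
```

A pre-trained model fine-tuned on such sequences must learn the table structure implicitly from flat text, which is the gap structure-aware pre-training is designed to close.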

Citation (APA)

Xing, X., & Wan, X. (2021). Structure-Aware Pre-Training for Table-to-Text Generation. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 2273–2278). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.200
