Text simplification aims to reduce the linguistic complexity of text while preserving its meaning. Recent work has begun exploring neural text simplification (NTS) with the attentional sequence-to-sequence (Seq2seq) model, which has achieved success in many text generation tasks. However, handling long-range dependencies and out-of-vocabulary (OOV) words remains a challenge for text simplification. In this paper, to address these problems, we propose a text simplification model that incorporates a self-attention mechanism and a pointer-generator network. Experiments on aligned Wikipedia and Simple Wikipedia datasets demonstrate that our model outperforms the baseline systems.
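The pointer-generator idea mentioned in the abstract can be sketched as follows: the decoder mixes its vocabulary distribution with the attention distribution over source tokens, so OOV words can be copied directly from the input. This is a minimal illustrative sketch of the general mechanism, not the paper's exact model; the function name and inputs are hypothetical.

```python
import numpy as np

def pointer_generator_dist(p_vocab, attn, src_ids, p_gen):
    """Hypothetical helper: mix the generation distribution with a
    copy distribution, as in a generic pointer-generator network.

    p_vocab : decoder's distribution over the (extended) vocabulary
    attn    : attention weights over the source tokens
    src_ids : vocabulary id of each source token
    p_gen   : scalar in [0, 1], probability of generating vs. copying
    """
    # Scale the decoder's vocabulary distribution by p_gen.
    final = p_gen * np.asarray(p_vocab, dtype=float)
    # Route the remaining (1 - p_gen) mass through attention onto each
    # source token's id, letting the model copy words from the source.
    for a, tok in zip(attn, src_ids):
        final[tok] += (1.0 - p_gen) * a
    return final

# Toy example: 5-word vocabulary, 2 source tokens with ids 3 and 4.
dist = pointer_generator_dist(
    p_vocab=[0.5, 0.3, 0.2, 0.0, 0.0],
    attn=[0.6, 0.4],
    src_ids=[3, 4],
    p_gen=0.5,
)
```

With `p_gen = 0.5`, half the probability mass comes from generation and half from copying, so tokens 3 and 4 (absent from `p_vocab`) still receive nonzero probability.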
CITATION STYLE
Li, T., Li, Y., Qiang, J., & Yuan, Y. H. (2018). Text simplification with self-attention-based pointer-generator networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11305 LNCS, pp. 537–545). Springer Verlag. https://doi.org/10.1007/978-3-030-04221-9_48