Abstract
Text Simplification (TS) is the task of converting a text into a form that is easier to read while maintaining the meaning of the original text. A sub-task of TS is Cognitive Simplification (CS), converting text to a form that is readily understood by people with cognitive disabilities without rendering it childish or simplistic. This sub-task has yet to be explored with neural methods in NLP, and resources for it are scarcely available. In this paper, we present a method for incorporating knowledge from the cognitive accessibility domain into a TS model, by introducing an inductive bias regarding what simplification operations to use. We show that by adding this inductive bias to a TS-trained model, it is able to adapt better to CS without ever seeing CS data, and outperform a baseline model on a traditional TS benchmark. In addition, we provide a novel test dataset for CS, and analyze the differences between CS corpora and existing TS corpora, in terms of how simplification operations are applied.
Citation
Chamovitz, E., & Abend, O. (2022). Cognitive Simplification Operations Improve Text Simplification. In CoNLL 2022 - 26th Conference on Computational Natural Language Learning, Proceedings of the Conference (pp. 241–265). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.conll-1.17