Neural methods for cross-lingual sentence compression

Abstract

Sentence compression produces a shorter sentence by removing redundant information while preserving grammaticality and the important content. We propose an improvement to current neural deletion systems, which output a binary sequence of labels for an input sentence: a one indicates that the corresponding token from the source sentence remains in the compression, whereas a zero indicates that the token should be removed. Our main improvement is the use of a Conditional Random Field as the final layer, which allows decoding the best global sequence of labels for a given input. In addition, we evaluate the incorporation of syntactic features, which can improve grammaticality. Finally, the task is extended to a cross-lingual setting in which the models are evaluated on English and Portuguese. The proposed architecture achieves results equal to or better than current state-of-the-art systems, confirming that the model benefits from these modifications in both languages.
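To make the deletion formulation concrete, the sketch below shows how a binary keep/delete label sequence yields a compression, and how a linear-chain CRF's Viterbi decoding picks the best global label sequence from per-token scores. The scores and transition values are purely illustrative assumptions (a trained model would produce them); this is not the paper's implementation.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring global label sequence.

    emissions: (n_tokens, 2) array of per-token scores for labels
               0 (delete) and 1 (keep).
    transitions: (2, 2) array of label-to-label scores, as in the
                 transition matrix of a linear-chain CRF layer.
    """
    n, k = emissions.shape
    score = emissions[0].copy()
    backptr = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        # candidate[i, j]: best score ending in label i at t-1,
        # then transitioning to label j at t.
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)
    # Follow back-pointers from the best final label.
    labels = [int(score.argmax())]
    for t in range(n - 1, 0, -1):
        labels.append(int(backptr[t, labels[-1]]))
    return labels[::-1]

def compress(tokens, labels):
    """Keep only tokens labelled 1, as in deletion-based compression."""
    return [tok for tok, lab in zip(tokens, labels) if lab == 1]

tokens = ["The", "very", "old", "cat", "slept", "soundly"]
# Hypothetical emission scores; column 0 = delete, column 1 = keep.
emissions = np.array([[0.1, 2.0], [1.5, 0.2], [1.2, 0.4],
                      [0.1, 2.5], [0.0, 3.0], [2.0, 0.3]])
# Hypothetical transitions mildly favouring staying in the same label.
transitions = np.array([[0.5, 0.0], [0.0, 0.5]])
labels = viterbi_decode(emissions, transitions)
print(compress(tokens, labels))  # ['The', 'cat', 'slept']
```

The key point the paper exploits is that, unlike independent per-token classification, the CRF scores the whole label sequence at once, so transition scores can discourage implausible keep/delete patterns.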

Citation (APA)

Rodrigues, F., Martins, B., & Ribeiro, R. (2018). Neural methods for cross-lingual sentence compression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11089 LNAI, pp. 104–114). Springer Verlag. https://doi.org/10.1007/978-3-319-99344-7_10
