Sentence compression produces a shorter sentence by removing redundant information while preserving grammaticality and the important content. We propose an improvement to current neural deletion systems. These systems output a binary sequence of labels for an input sentence: one indicates that a token from the source sentence remains in the compression, whereas zero indicates that the token should be removed. Our main improvement is the use of a Conditional Random Field as the final layer, which enables decoding the best global sequence of labels for a given input. In addition, we evaluate the incorporation of syntactic features, which can improve grammaticality. Finally, the task is extended to a cross-lingual setting in which the models are evaluated on English and Portuguese. The proposed architecture achieves results better than or equal to current state-of-the-art systems, showing that the model benefits from these modifications in both languages.
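To make the deletion formulation concrete, the following is a minimal illustrative sketch (not the authors' implementation): a Viterbi decode over the two labels {0, 1}, as a CRF final layer would perform, followed by applying the resulting binary mask to the tokens. The emission and transition scores here are hypothetical placeholders; in the actual model they would come from the trained network and CRF parameters.

```python
import numpy as np

def viterbi(emissions, transitions):
    """Decode the best global label sequence (illustrative Viterbi pass).

    emissions:   (T, 2) array, per-token scores for labels {0, 1}
    transitions: (2, 2) array, score of moving from label i to label j
    """
    T, L = emissions.shape
    score = emissions[0].copy()          # best score ending in each label
    back = np.zeros((T, L), dtype=int)   # backpointers for reconstruction
    for t in range(1, T):
        cand = score[:, None] + transitions + emissions[t]  # (L, L)
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    labels = [int(score.argmax())]
    for t in range(T - 1, 0, -1):        # follow backpointers
        labels.append(int(back[t, labels[-1]]))
    return labels[::-1]

def compress(tokens, labels):
    """Keep exactly the tokens whose label is 1 (deletion-based compression)."""
    return [tok for tok, lab in zip(tokens, labels) if lab == 1]

tokens = "The quick brown fox jumps over the lazy dog".split()
labels = [1, 0, 0, 1, 1, 0, 0, 0, 1]
print(" ".join(compress(tokens, labels)))  # → The fox jumps dog
```

With zero transition scores the decode reduces to a per-token argmax; the CRF's learned transitions are what let the model prefer globally coherent label sequences over independent per-token decisions.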
Rodrigues, F., Martins, B., & Ribeiro, R. (2018). Neural methods for cross-lingual sentence compression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11089 LNAI, pp. 104–114). Springer Verlag. https://doi.org/10.1007/978-3-319-99344-7_10