Long-Distance Dependencies Don't Have to Be Long: Simplifying through Provably (Approximately) Optimal Permutations

Abstract

Neural models at the sentence level often operate on the constituent words/tokens in a way that encodes the inductive bias of processing the input in the same fashion as humans do. However, there is no guarantee that the standard ordering of words is computationally efficient or optimal. To help mitigate this, we consider a dependency parse as a proxy for the inter-word dependencies in a sentence and simplify the sentence with respect to combinatorial objectives imposed on the sentence-parse pair. The associated optimization results in permuted sentences that are provably (approximately) optimal with respect to minimizing dependency parse lengths and that are demonstrably simpler. We evaluate our general-purpose permutations within a fine-tuning schema for the downstream task of subjectivity analysis. Our fine-tuned baselines reflect a new state of the art for the SUBJ dataset, and the permutations we introduce lead to further improvements, with a 2.0% increase in classification accuracy (absolute) and a 45% reduction in classification error (relative) over the previous state of the art.
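To make the objective concrete, the Python sketch below computes the total dependency length of a sentence-parse pair and finds the word order that minimizes it by exhaustive search over permutations. This is only an illustration under assumptions: the toy sentence, its parse, and the helper `total_dependency_length` are hypothetical, and the brute-force search stands in for the paper's provably (approximately) optimal constructions, which it does not reproduce.

```python
# Illustrative sketch of the dependency-length objective (not the paper's algorithm).
# The example sentence and its head assignments are assumptions for demonstration.
from itertools import permutations


def total_dependency_length(order, heads):
    """Sum of |pos(child) - pos(head)| under a given word order.

    order : sequence of original token indices giving the permuted order
    heads : dict mapping child token index -> head token index (root omitted)
    """
    pos = {tok: i for i, tok in enumerate(order)}
    return sum(abs(pos[child] - pos[head]) for child, head in heads.items())


# Toy sentence with a hypothetical dependency parse (token index -> head index);
# "ran" (index 5) is the root and has no entry.
tokens = ["the", "cat", "that", "I", "saw", "ran"]
heads = {0: 1, 1: 5, 2: 4, 3: 4, 4: 1}

baseline = total_dependency_length(range(len(tokens)), heads)

# Brute-force search over all orderings; feasible only for very short sentences.
best = min(permutations(range(len(tokens))),
           key=lambda p: total_dependency_length(p, heads))

print("original order:", tokens, "-> length", baseline)
print("best permutation:", [tokens[i] for i in best],
      "-> length", total_dependency_length(best, heads))
```

The factorial cost of this search is exactly why a provably (approximately) optimal construction, rather than enumeration, is needed for real sentences.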

Citation (APA)

Bommasani, R. (2019). Long-distance dependencies don’t have to be long: Simplifying through provably (approximately) optimal permutations. In ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Student Research Workshop (pp. 89–99). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p19-2012
