Contextualized Word Representations for Reading Comprehension

20 Citations · 152 Mendeley Readers

Abstract

Reading a document and extracting an answer to a question about its content has attracted substantial attention recently. While most work has focused on the interaction between the question and the document, in this work we evaluate the importance of context when the question and document are processed independently. We take a standard neural architecture for this task, and show that by providing rich contextualized word representations from a large pre-trained language model, as well as allowing the model to choose between context-dependent and context-independent word representations, we can obtain dramatic improvements and reach performance comparable to state-of-the-art on the competitive SQuAD dataset.
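The choice between context-dependent and context-independent representations described above can be realized with a learned elementwise gate that mixes the two vectors. The sketch below is illustrative only — the weight matrix, dimensions, and gating formula are assumptions for exposition, not the paper's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_mix(contextual, static, W, b):
    """Mix a context-dependent vector (e.g. from a pre-trained LM) with a
    context-independent one (e.g. a GloVe embedding) via a learned gate.
    Each output dimension is a convex combination of the two inputs."""
    g = sigmoid(np.concatenate([contextual, static]) @ W + b)  # gate in (0, 1)
    return g * contextual + (1.0 - g) * static

# Toy dimensions and random parameters (hypothetical, for illustration).
d = 4
W = rng.normal(size=(2 * d, d))
b = np.zeros(d)
c = rng.normal(size=d)  # context-dependent representation
s = rng.normal(size=d)  # context-independent representation

mixed = gated_mix(c, s, W, b)
```

Because the gate lies in (0, 1) per dimension, each coordinate of `mixed` stays between the corresponding coordinates of the two input vectors, so the model can smoothly interpolate between the two representation types per word.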

Cite

CITATION STYLE

APA

Salant, S., & Berant, J. (2018). Contextualized Word Representations for Reading Comprehension. In NAACL HLT 2018 - 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference (Vol. 2, pp. 554–559). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/n18-2088
