Question answering using hierarchical attention on top of BERT features


Abstract

Machine Comprehension (MC) tests a machine's ability to answer a question about a given passage, which requires modeling complex interactions between the passage and the question. Recently, attention mechanisms have been successfully extended to machine comprehension. In this work, the question and passage are encoded with BERT language embeddings to better capture their respective representations at a semantic level. Attention and fusion are then conducted horizontally and vertically, across layers and at different levels of granularity, between the question and the passage. Our experiments were performed on the datasets provided in the MRQA 2019 shared task.
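
The abstract suggests a two-stage design: a "horizontal" step that attends between question and passage tokens within each BERT layer, and a "vertical" step that fuses those per-layer representations across layers. The sketch below is a minimal illustration of that idea, not the authors' implementation; it assumes PyTorch and HuggingFace transformers, and the function names (encode, cross_attend), the concatenation-based fusion, and the random layer weights are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of hierarchical attention over
# BERT layer features. Assumes PyTorch and HuggingFace `transformers`.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)

def encode(text):
    """Return all layer hidden states: a tuple of (1, seq_len, hidden) tensors
    (embedding layer + 12 transformer layers for bert-base)."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        return bert(**inputs).hidden_states

def cross_attend(passage_h, question_h):
    """'Horizontal' step: attend from passage tokens to question tokens within
    one layer, then fuse the attended summary with the original features."""
    scores = passage_h @ question_h.transpose(-1, -2)            # (1, Lp, Lq)
    attn = F.softmax(scores / passage_h.size(-1) ** 0.5, dim=-1)
    attended = attn @ question_h                                 # (1, Lp, H)
    # Simple fusion by concatenation; the paper's exact fusion may differ.
    return torch.cat([passage_h, attended, passage_h * attended], dim=-1)

q_layers = encode("Who wrote the novel?")
p_layers = encode("The novel was written by Jane Austen in 1813.")

# 'Vertical' step: combine the fused per-layer representations across layers
# with a softmax weighting (random here for illustration; learned in practice).
fused_per_layer = torch.stack(
    [cross_attend(p, q) for p, q in zip(p_layers, q_layers)], dim=0
)  # (num_layers, 1, Lp, 3 * hidden)
layer_weights = F.softmax(torch.randn(fused_per_layer.size(0)), dim=0)
passage_repr = (layer_weights.view(-1, 1, 1, 1) * fused_per_layer).sum(0)
print(passage_repr.shape)  # (1, Lp, 2304) for bert-base
```

In a full extractive QA model, passage_repr would feed span-start and span-end classifiers; that head is omitted here since the abstract does not specify it.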

Citation (APA)

Osama, R., El-Makky, N., & Torki, M. (2019). Question answering using hierarchical attention on top of BERT features. In MRQA@EMNLP 2019 - Proceedings of the 2nd Workshop on Machine Reading for Question Answering (pp. 191–195). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d19-5825
