R-Trans: RNN Transformer Network for Chinese Machine Reading Comprehension

30 citations · 40 Mendeley readers

This article is free to access.

Abstract

Machine reading comprehension (MRC) has attracted increasing attention over the past few years. A variety of benchmark datasets have been released, triggering the development of quite a few MRC approaches based on deep learning techniques. However, most existing models are designed for English MRC. When applied directly to Chinese documents, their performance often degrades considerably because of special characteristics of Chinese, in particular the inevitable word segmentation errors. In this paper, we present the RNN Transformer network to tackle the Chinese MRC task. To mitigate the influence of incorrect word segmentation and to mine sequential information across whole sentences, our model adopts deep contextualized word representations and bidirectional gated recurrent unit (GRU) networks. Extensive experiments were conducted on a large-scale Chinese MRC corpus, the Les MMRC dataset. The results show that the proposed model notably outperforms the baseline and other prevalent MRC models, establishing a new state-of-the-art result on the Les MMRC dataset.
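The bidirectional GRU encoding mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's actual implementation: the parameter shapes, random initialization, and the stand-in "contextualized" token vectors are all assumptions for demonstration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(params, x, h):
    # Standard GRU update: z is the update gate, r the reset gate.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)
    r = sigmoid(Wr @ x + Ur @ h)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
    return (1.0 - z) * h + z * h_tilde

def bigru_encode(xs, fwd_params, bwd_params, hidden):
    # Run one GRU left-to-right and another right-to-left over the
    # token vectors, then concatenate the two hidden states at each
    # position, so every token sees its full sentence context.
    h_f, h_b = np.zeros(hidden), np.zeros(hidden)
    fwd, bwd = [], []
    for x in xs:
        h_f = gru_cell(fwd_params, x, h_f)
        fwd.append(h_f)
    for x in reversed(xs):
        h_b = gru_cell(bwd_params, x, h_b)
        bwd.append(h_b)
    bwd.reverse()
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

# Toy usage: 5 stand-in "contextualized" 8-dim token vectors are
# encoded into 2 * 6 = 12-dim bidirectional representations.
rng = np.random.default_rng(0)
emb_dim, hidden, seq_len = 8, 6, 5

def make_params():
    # Shapes in gate order: Wz, Uz, Wr, Ur, Wh, Uh.
    return tuple(rng.normal(scale=0.1, size=s)
                 for s in [(hidden, emb_dim), (hidden, hidden)] * 3)

tokens = [rng.normal(size=emb_dim) for _ in range(seq_len)]
encoded = bigru_encode(tokens, make_params(), make_params(), hidden)
print(len(encoded), encoded[0].shape)
```

Feeding character-aware contextualized vectors (rather than hard word segmentations) into such an encoder is what lets the model tolerate segmentation errors.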

Citation (APA)

Liu, S., Zhang, S., Zhang, X., & Wang, H. (2019). R-Trans: RNN Transformer Network for Chinese Machine Reading Comprehension. IEEE Access, 7, 27736–27745. https://doi.org/10.1109/ACCESS.2019.2901547
