A hierarchical conditional attention-based neural networks for paraphrase generation

Abstract

Sequence-to-Sequence (Seq2Seq) learning has attracted immense interest in recent years. The success of end-to-end training with encoder-decoder neural networks in machine translation has spurred active research on other transduction tasks such as abstractive summarization and, in particular, Paraphrase Generation (PG). An intrinsic obstacle for existing paraphrase generation solutions is that they do not pay enough attention to the fact that words and sentences carry different importance depending on their context. As a consequence, crucial information may be lost and irrelevant paraphrase components generated. To overcome these barriers, a Hierarchical Conditional Attention-based Neural Networks (HCANN) architecture is proposed as an end-to-end text generation framework. More specifically, the method represents the hierarchy of a document (words composing sentences, sentences composing the document) and couples this representation with a conditional decoder for the paraphrase generation process. Quantitative evaluation on several benchmark paraphrase datasets demonstrates the method's effectiveness, with improvements over existing approaches by a significant margin.
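
The abstract describes a hierarchical attention encoder (attention at the word level and the sentence level, so that important words and sentences are weighted more heavily) feeding a conditional decoder. Below is a minimal PyTorch sketch of that general design; the module names, dimensions, and exact wiring are illustrative assumptions, not the authors' published HCANN implementation.

# Hypothetical sketch of a hierarchical attention encoder with a
# conditional attention decoder, loosely following the abstract's
# description. All names, dimensions, and wiring are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention over a sequence of states."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, 1, bias=False)

    def forward(self, states, query=None):
        # states: (batch, seq, dim); optional query: (batch, dim)
        scores = self.proj(states)
        if query is not None:
            scores = scores + query.unsqueeze(1)
        weights = F.softmax(self.v(torch.tanh(scores)), dim=1)  # (batch, seq, 1)
        return (weights * states).sum(dim=1), weights           # context, weights

class HierarchicalEncoder(nn.Module):
    """Word-level GRU + attention per sentence, then sentence-level GRU + attention."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.word_attn = AdditiveAttention(2 * hid_dim)
        self.sent_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        self.sent_attn = AdditiveAttention(2 * hid_dim)

    def forward(self, docs):
        # docs: (batch, n_sents, n_words) of token ids
        b, s, w = docs.shape
        words = self.embed(docs.view(b * s, w))        # (b*s, w, emb)
        word_states, _ = self.word_rnn(words)          # (b*s, w, 2*hid)
        sent_vecs, _ = self.word_attn(word_states)     # attend over words -> (b*s, 2*hid)
        sent_states, _ = self.sent_rnn(sent_vecs.view(b, s, -1))
        doc_vec, _ = self.sent_attn(sent_states)       # attend over sentences -> (b, 2*hid)
        return sent_states, doc_vec

class ConditionalDecoder(nn.Module):
    """GRU decoder whose input at each step is conditioned on an
    attention context over the encoder's sentence-level states."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = AdditiveAttention(2 * hid_dim)
        self.query = nn.Linear(hid_dim, 2 * hid_dim)
        self.rnn = nn.GRUCell(emb_dim + 2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.init_h = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, tokens, sent_states, doc_vec):
        # tokens: (batch, tgt_len) teacher-forced target ids
        h = torch.tanh(self.init_h(doc_vec))           # init state from document vector
        logits = []
        for t in range(tokens.size(1)):
            ctx, _ = self.attn(sent_states, self.query(h))
            x = torch.cat([self.embed(tokens[:, t]), ctx], dim=-1)
            h = self.rnn(x, h)
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)              # (batch, tgt_len, vocab)

# Usage: a batch of 2 documents, each 3 sentences of 5 tokens
enc = HierarchicalEncoder(vocab_size=1000)
dec = ConditionalDecoder(vocab_size=1000)
docs = torch.randint(0, 1000, (2, 3, 5))
tgt = torch.randint(0, 1000, (2, 7))
sent_states, doc_vec = enc(docs)
print(dec(tgt, sent_states, doc_vec).shape)  # torch.Size([2, 7, 1000])

The two-level attention is what lets the model weight words and sentences differently by context, as the abstract motivates; the decoder is "conditional" here in the sense that each generation step is conditioned on a fresh attention context over the encoded hierarchy.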

Citation (APA)

Nguyen-Ngoc, K., Le, A. C., & Nguyen, V. H. (2018). A hierarchical conditional attention-based neural networks for paraphrase generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11248 LNAI, pp. 161–174). Springer Verlag. https://doi.org/10.1007/978-3-030-03014-8_14
