Long Text QA Matching Model Based on BiGRU–DAttention–DSSM


Abstract

QA matching is an important task in natural language processing, but current research on text matching focuses more on short texts than on long texts. Compared with short texts, long texts are rich in information, but they also contain frequent distracting information. This paper extracts question-and-answer pairs about psychological counseling to study deep-learning-based long text QA matching. We adapted DSSM (Deep Structured Semantic Model) to the QA-matching task. Moreover, to better extract long text features, we improved DSSM by enriching the text representation layer with a bidirectional recurrent neural network and an attention mechanism. The experimental results show that BiGRU–DAttention–DSSM performs better at matching questions and answers.
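The matching step the abstract builds on is DSSM's: encode the question and each candidate answer into semantic vectors (here produced by the BiGRU-with-attention representation layer), score pairs by cosine similarity, and normalize the smoothed similarities with a softmax. A minimal sketch of that scoring step, with hand-picked toy vectors standing in for encoder outputs (illustrative only, not the authors' code):

```python
import math

def cosine(u, v):
    # Cosine similarity between two semantic vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def match_scores(question_vec, answer_vecs, gamma=1.0):
    # DSSM-style matching: softmax over smoothed cosine similarities.
    # gamma is the smoothing factor from the original DSSM formulation.
    sims = [gamma * cosine(question_vec, a) for a in answer_vecs]
    m = max(sims)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in sims]
    total = sum(exps)
    return [e / total for e in exps]

# Toy vectors standing in for representation-layer outputs.
q = [0.9, 0.1, 0.0]
answers = [[0.8, 0.2, 0.1],   # semantically close to q
           [0.0, 0.1, 0.9]]   # semantically distant from q
probs = match_scores(q, answers)
```

In the full model, `question_vec` and each entry of `answer_vecs` would come from the BiGRU–DAttention representation layer rather than being fixed lists; the matching layer itself is unchanged.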

Citation (APA)
Chen, S., & Xu, T. (2021). Long text QA matching model based on BiGRU–DAttention–DSSM. Mathematics, 9(10). https://doi.org/10.3390/math9101129
