Abstract
Paraphrase generation can be modeled as a sequence-to-sequence (Seq2Seq) learning problem. However, a typical Seq2Seq model is prone to losing the original meaning, since the vector representation of the source sentence is often inadequate for capturing complex semantics. Naturally, a paraphrase concerns the same topic as its source, so the topic can serve as auxiliary guidance that promotes preservation of the source semantics. Moreover, suitable words for restating the sentence can be derived from the topic information. To exploit topics in paraphrase generation, we incorporate topic words into the Seq2Seq framework through a topic-aware input and a topic-biased generation distribution. Direct supervision signals are also introduced to help the model handle the topic information more accurately. Empirical studies on two benchmark datasets show that the proposed method significantly improves the basic Seq2Seq model and is comparable with state-of-the-art systems.
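The abstract does not give the exact formulation, but the idea of a "topic-biased generation distribution" can be illustrated with a minimal sketch: the decoder's softmax over the vocabulary is interpolated with a distribution concentrated on the topic words, so topical tokens receive extra probability mass. The function name, the uniform topic distribution, and the `bias_weight` parameter below are illustrative assumptions, not the authors' actual method.

```python
import torch
import torch.nn.functional as F

def topic_biased_distribution(decoder_logits: torch.Tensor,
                              topic_word_ids: torch.Tensor,
                              bias_weight: float = 0.1) -> torch.Tensor:
    """Mix the Seq2Seq output distribution with a bias toward topic words.

    decoder_logits: (batch, vocab) pre-softmax scores from the decoder.
    topic_word_ids: 1-D tensor of vocabulary indices of the topic words.
    bias_weight:    interpolation weight of the topical component
                    (hypothetical; the paper's weighting is not given here).
    """
    # Standard Seq2Seq generation distribution over the vocabulary.
    p_vocab = F.softmax(decoder_logits, dim=-1)

    # A simple topical distribution: uniform over the topic words
    # (a placeholder for whatever topic-model scores the paper uses).
    p_topic = torch.zeros_like(p_vocab)
    p_topic[:, topic_word_ids] = 1.0 / topic_word_ids.numel()

    # A convex combination keeps the result a valid probability distribution.
    return (1.0 - bias_weight) * p_vocab + bias_weight * p_topic


# Example: bias a random decoder output toward three topic words.
logits = torch.randn(2, 10000)
topic_ids = torch.tensor([17, 256, 4096])
p = topic_biased_distribution(logits, topic_ids)
assert torch.allclose(p.sum(dim=-1), torch.ones(2))
```

Because the mixture is convex, each row of the output still sums to one, so it can be plugged directly into beam search or a cross-entropy loss in place of the plain decoder softmax.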
Citation
Liu, Y., Lin, Z., Liu, F., Dai, Q., & Wang, W. (2019). Generating paraphrase with topic as prior knowledge. In Proceedings of the International Conference on Information and Knowledge Management (pp. 2381–2384). Association for Computing Machinery. https://doi.org/10.1145/3357384.3358102