Learning text pair similarity with context-sensitive autoencoders


Abstract

We present a pairwise context-sensitive autoencoder for computing text pair similarity. Our model encodes input text into context-sensitive representations and uses them to compute similarity between text pairs. It outperforms state-of-the-art models on two semantic retrieval tasks and a contextual word similarity task. For retrieval, our unsupervised approach, which merely ranks inputs by the cosine similarity between their hidden representations, performs comparably to state-of-the-art supervised models and in some cases outperforms them.
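The unsupervised retrieval setup described above can be sketched in a few lines: encode each text into a hidden representation, then rank candidates by cosine similarity to the query's representation. The one-layer `tanh` encoder below is a hypothetical stand-in for the paper's context-sensitive autoencoder, used only to make the ranking step concrete.

```python
import numpy as np

def encode(x, W, b):
    # Hypothetical encoder: a single affine layer with tanh, standing in
    # for the paper's context-sensitive autoencoder (an assumption).
    return np.tanh(x @ W + b)

def cosine(u, v):
    # Cosine similarity between two hidden representations.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def rank_candidates(query, candidates, W, b):
    # Rank candidate texts (as feature vectors) by cosine similarity of
    # their hidden representations to the query's representation.
    q = encode(query, W, b)
    sims = [cosine(q, encode(c, W, b)) for c in candidates]
    return sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)
```

In the actual model the encoder is trained to reconstruct its input and to incorporate context; here only the ranking-by-cosine step is illustrated.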

Citation (APA)

Amiri, H., Resnik, P., Boyd-Graber, J., & Daumé, H. (2016). Learning text pair similarity with context-sensitive autoencoders. In 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers (Vol. 4, pp. 1882–1892). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p16-1177
