Query2Prod2Vec: Grounded Word Embeddings for eCommerce

Abstract

We present Query2Prod2Vec, a model that grounds lexical representations for product search in product embeddings: in our model, meaning is a mapping between words and a latent space of products in a digital shop. We leverage shopping sessions to learn the underlying space and use merchandising annotations to build lexical analogies for evaluation: our experiments show that our model is more accurate than known techniques from the NLP and IR literature. Finally, we stress the importance of data efficiency for product search outside of retail giants, and highlight how Query2Prod2Vec fits with practical constraints faced by most practitioners.
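The abstract describes two stages: learning a latent product space from shopping sessions, and grounding each query as a mapping into that space. Below is a minimal sketch of that idea, assuming gensim's Word2Vec is used as the session-based product embedder and that a query is represented by averaging the embeddings of products shoppers interacted with after issuing it; the session data, click log, and all names (sessions, query_clicks, query_embedding) are illustrative, not taken from the paper.

```python
from gensim.models import Word2Vec
import numpy as np

# 1) Learn the latent product space from shopping sessions:
#    each session is treated as a "sentence" whose tokens are product IDs.
sessions = [
    ["sku_123", "sku_456", "sku_789"],
    ["sku_456", "sku_999"],
    ["sku_789", "sku_123", "sku_999"],
]
prod2vec = Word2Vec(sentences=sessions, vector_size=48, window=5,
                    min_count=1, sg=1, epochs=20)

# 2) Ground queries in that space: a query's embedding is the average of the
#    embeddings of products users engaged with after issuing the query.
query_clicks = {
    "running shoes": ["sku_123", "sku_789"],
    "sneakers": ["sku_123", "sku_456"],
}

def query_embedding(query: str) -> np.ndarray:
    vectors = [prod2vec.wv[sku] for sku in query_clicks[query] if sku in prod2vec.wv]
    return np.mean(vectors, axis=0)

# Queries that lead to similar products end up close in the product space.
q1, q2 = query_embedding("running shoes"), query_embedding("sneakers")
cosine = float(np.dot(q1, q2) / (np.linalg.norm(q1) * np.linalg.norm(q2)))
print(f"similarity('running shoes', 'sneakers') = {cosine:.3f}")
```

With real traffic the session corpus and click log would be far larger; the point of the sketch is only that the word-level representation is built entirely from product behavior, which is what makes the approach data-efficient for shops without large query logs.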

Citation (APA)

Bianchi, F., Tagliabue, J., & Yu, B. (2021). Query2Prod2Vec: Grounded Word Embeddings for eCommerce. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Industry Papers (pp. 154–162). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-industry.20
