Sentiment Analysis with Contextual Embeddings and Self-attention

Abstract

In natural language, the intended meaning of a word or phrase is often implicit and depends on the context. In this work, we propose a simple yet effective method for sentiment analysis using contextual embeddings and a self-attention mechanism. The experimental results for three languages, including the morphologically rich Polish and German, show that our model is comparable to or even outperforms state-of-the-art models. In all cases, the superiority of models leveraging contextual embeddings is demonstrated. Finally, this work is intended as a step towards introducing a universal, multilingual sentiment classifier.
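
The sketch below is a minimal, self-contained illustration of this general kind of architecture (contextual token embeddings pooled by self-attention, followed by a linear sentiment head), assuming a PyTorch + Hugging Face Transformers setup. It is not the authors' implementation: the encoder name (xlm-roberta-base), the number of attention heads, the mean-pooling step, and the three-class output are illustrative assumptions.

# Illustrative sketch only; not the configuration reported in the paper.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class SelfAttentionSentimentClassifier(nn.Module):
    def __init__(self, encoder_name="xlm-roberta-base", num_classes=3, num_heads=8):
        super().__init__()
        # Pretrained multilingual encoder supplies contextual token embeddings.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Single self-attention layer over the token sequence.
        self.self_attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual embeddings: (batch, seq_len, hidden)
        embeddings = self.encoder(input_ids=input_ids,
                                  attention_mask=attention_mask).last_hidden_state
        # Self-attention over tokens; padding positions are masked out.
        attn_out, _ = self.self_attn(embeddings, embeddings, embeddings,
                                     key_padding_mask=~attention_mask.bool())
        # Mean-pool the attended token representations and predict sentiment logits.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (attn_out * mask).sum(dim=1) / mask.sum(dim=1)
        return self.classifier(pooled)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = SelfAttentionSentimentClassifier()
batch = tokenizer(["The film was surprisingly good."], return_tensors="pt",
                  padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 3])

In practice, the classifier head (and optionally the encoder) would be fine-tuned on labeled sentiment data for each target language.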

Citation (APA)

Biesialska, K., Biesialska, M., & Rybinski, H. (2020). Sentiment Analysis with Contextual Embeddings and Self-attention. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12117 LNAI, pp. 32–41). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-59491-6_4
