In recent years, text has been the main form of communication on social media platforms such as Twitter, Reddit, Facebook, Instagram, and YouTube. Emotion recognition on these platforms can be exploited for a wide range of applications. A review of the current literature found that Transformer-based deep learning models show very promising results when trained and fine-tuned for emotion recognition tasks. This paper provides an overview of the architecture of three of the most popular Transformer-based models: BERT Base, DistilBERT, and RoBERTa. These models are then fine-tuned on the "Emotions" dataset, a corpus of English tweets annotated with six emotions, and their performance is evaluated. The results of this experiment show that while all of the models demonstrated excellent emotion recognition capabilities, each obtaining an F1-score above 92%, DistilBERT could be trained in nearly half the time of the other models. The use of DistilBERT for emotion recognition tasks is therefore encouraged.
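As a concrete illustration of the fine-tuning procedure the abstract describes, the sketch below shows how one of the evaluated models (DistilBERT) might be fine-tuned for six-class emotion classification with the Hugging Face transformers library. This is not the authors' code: the dataset identifier dair-ai/emotion, the distilbert-base-uncased checkpoint, and all hyperparameters are assumptions chosen for illustration, and the weighted F1 metric only mirrors the evaluation measure reported in the abstract.

```python
# Minimal sketch (assumed setup, not the authors' implementation) of
# fine-tuning DistilBERT on a six-emotion tweet corpus.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Assumed to correspond to the "Emotions" corpus: English tweets labelled
# with sadness, joy, love, anger, fear, or surprise.
dataset = load_dataset("dair-ai/emotion")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate/pad tweets to a fixed length for batching.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6)

def compute_metrics(eval_pred):
    # Weighted F1, matching the metric reported in the abstract.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"f1": f1_score(labels, preds, average="weighted")}

args = TrainingArguments(
    output_dir="distilbert-emotion",   # illustrative values only
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    compute_metrics=compute_metrics,
)

trainer.train()
print(trainer.evaluate())
```

The same script would cover BERT Base or RoBERTa by swapping the checkpoint name (e.g. bert-base-uncased or roberta-base); DistilBERT's appeal in the paper is that it reaches comparable F1 in roughly half the training time.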
Gomez, L. R., Watt, T., Babaagba, K. O., Chrysoulas, C., Homay, A., Rangarajan, R., & Liu, X. (2023). Emotion Recognition on Social Media Using Natural Language Processing (NLP) Techniques. In ACM International Conference Proceeding Series (pp. 113–118). Association for Computing Machinery. https://doi.org/10.1145/3625156.3625173