Sentiment Analysis of Sentence-Level using Dependency Embedding and Pre-trained BERT Model

  • Ruskanda F. Z.
  • Setiawan S. S. Y.
  • Aditama N.
  • Khodra M. L.

Abstract

Sentiment analysis is a valuable field of research in NLP with many applications. The dependency tree is one of the linguistic features that can be exploited in this field. Dependency embedding, a semantic representation of a sentence, has been shown to yield better results than other embeddings, making it a promising way to improve the performance of sentiment analysis tasks. This study investigated the effect of dependency embedding on sentence-level sentiment analysis through experimental research. It replaced the Vocabulary Graph embedding in the VGCN-BERT sentiment classification architecture with several dependency embedding representations: the word vector, the context vector, the average of the word and context vectors, a weighted combination of the word and context vectors, and the concatenation of the word and context vectors. Experiments were conducted on two datasets, SST-2 and CoLA, comprising more than 19 thousand labeled sentiment sentences. The results indicate that dependency embedding can enhance the performance of sentence-level sentiment analysis.
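The five combination strategies listed in the abstract can be sketched as follows. This is a minimal illustration assuming each word has a dependency-based word vector and a context vector of equal dimension; the function name, the `alpha` weight, and the strategy labels are illustrative assumptions, not identifiers from the paper's implementation.

```python
import numpy as np

def combine(word_vec, ctx_vec, strategy, alpha=0.5):
    """Combine a dependency word vector and its context vector.

    Hypothetical sketch of the five representations described in the
    abstract: word only, context only, average, weighted sum, and
    concatenation. `alpha` is an assumed mixing weight for the
    weighted variant.
    """
    word_vec = np.asarray(word_vec, dtype=float)
    ctx_vec = np.asarray(ctx_vec, dtype=float)
    if strategy == "word":      # word vector alone
        return word_vec
    if strategy == "context":   # context vector alone
        return ctx_vec
    if strategy == "average":   # element-wise mean of the two
        return (word_vec + ctx_vec) / 2.0
    if strategy == "weighted":  # convex combination with weight alpha
        return alpha * word_vec + (1.0 - alpha) * ctx_vec
    if strategy == "concat":    # merge by concatenation (doubles the dimension)
        return np.concatenate([word_vec, ctx_vec])
    raise ValueError(f"unknown strategy: {strategy}")
```

In the study's setup, the chosen representation would stand in for the Vocabulary Graph embedding that VGCN-BERT normally feeds alongside the BERT word embeddings; note that the concatenation variant changes the embedding dimension, so the downstream projection layer would need to be resized accordingly.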

Citation (APA)

Ruskanda, F. Z., Setiawan, S. S. Y., Aditama, N., & Khodra, M. L. (2023). Sentiment Analysis of Sentence-Level using Dependency Embedding and Pre-trained BERT Model. PIKSEL : Penelitian Ilmu Komputer Sistem Embedded and Logic, 11(1), 171–180. https://doi.org/10.33558/piksel.v11i1.6938
