Aspect-based sentiment analysis (ABSA) is a fine-grained form of sentiment analysis that operates at the aspect level. It focuses on extracting aspect terms from text or reviews, categorizing those terms, and classifying the sentiment polarity toward each aspect term and aspect category. Aspect term extraction (ATE) and aspect category detection (ACD) are interdependent, closely related tasks. However, most of the current literature on Arabic aspect-based sentiment analysis (ABSA) treats these tasks individually, assumes that aspect terms are already identified, or employs a pipeline model. Pipeline solutions use a separate model for each task, with the output of the ATE model serving as the input to the ACD model. This sequential process can propagate errors across stages, since the performance of the ACD model depends on any errors produced by the ATE model. The primary objective of this study was therefore to investigate a multi-task learning approach based on transfer learning and transformers. We propose a multi-task learning (MTL) model built on the pre-trained AraBERT language model, namely the MTL-AraBERT model, for extracting Arabic aspect terms and aspect categories simultaneously. Specifically, we trained a single model that addresses both subtasks jointly. Moreover, this paper also proposes a model integrating AraBERT, single pair classification, and BiLSTM/BiGRU that can be applied to aspect term polarity classification (APC) and aspect category polarity classification (ACPC). All proposed models were evaluated on the SemEval-2016 annotated Arabic hotel reviews dataset. The experimental results demonstrate that the proposed MTL model achieved performance comparable to or better than state-of-the-art work (F1-scores of 80.32% for ATE and 68.21% for ACD). The proposed SPC-BERT model demonstrated high accuracy, reaching 89.02% for APC and 89.36% for ACPC. These improvements hold significant potential for future research in Arabic ABSA.
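To make the shared-encoder, two-head setup described above concrete, the sketch below shows one plausible way to wire AraBERT to a token-classification head for ATE and a sentence-level multi-label head for ACD using PyTorch and Hugging Face Transformers. The model checkpoint name, class names, label counts, and equal loss weighting are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal multi-task sketch in the spirit of MTL-AraBERT (assumptions noted above):
# a shared AraBERT encoder feeds (a) per-token BIO logits for aspect term extraction
# and (b) sentence-level multi-label logits for aspect category detection.
import torch
import torch.nn as nn
from transformers import AutoModel

class MultiTaskABSA(nn.Module):
    def __init__(self, model_name="aubmindlab/bert-base-arabertv02",
                 num_bio_tags=3, num_categories=12):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)   # shared AraBERT encoder
        hidden = self.encoder.config.hidden_size
        self.ate_head = nn.Linear(hidden, num_bio_tags)        # per-token BIO tags (ATE)
        self.acd_head = nn.Linear(hidden, num_categories)      # multi-label categories (ACD)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state                   # (batch, seq_len, hidden)
        cls_state = token_states[:, 0]                          # [CLS] summary of the sentence
        return self.ate_head(token_states), self.acd_head(cls_state)

def joint_loss(ate_logits, acd_logits, bio_labels, category_labels):
    # Joint training: token-level cross-entropy (ATE) plus sentence-level
    # binary cross-entropy (ACD); equal weighting is an assumption.
    ate_loss = nn.functional.cross_entropy(
        ate_logits.view(-1, ate_logits.size(-1)), bio_labels.view(-1), ignore_index=-100)
    acd_loss = nn.functional.binary_cross_entropy_with_logits(
        acd_logits, category_labels.float())
    return ate_loss + acd_loss
```

For the polarity subtasks (APC/ACPC), a similarly hedged sketch of an AraBERT pair classifier with a recurrent layer is shown below: the review and an aspect phrase (or category name) are encoded together, the token states pass through a BiGRU, and a small head predicts the polarity. The recurrent hidden size, mean pooling, and three polarity classes are assumptions for illustration.

```python
class PairPolarityClassifier(nn.Module):
    def __init__(self, model_name="aubmindlab/bert-base-arabertv02",
                 rnn_hidden=128, num_polarities=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)    # AraBERT over (review, aspect) pair
        self.bigru = nn.GRU(self.encoder.config.hidden_size, rnn_hidden,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * rnn_hidden, num_polarities)

    def forward(self, input_ids, attention_mask, token_type_ids=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask,
                           token_type_ids=token_type_ids)
        rnn_out, _ = self.bigru(out.last_hidden_state)           # contextual states through BiGRU
        pooled = rnn_out.mean(dim=1)                              # simple mean pooling over tokens
        return self.classifier(pooled)
```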
Citation: Fadel, A., Saleh, M., Salama, R., & Abulnaja, O. (2024). MTL-AraBERT: An Enhanced Multi-Task Learning Model for Arabic Aspect-Based Sentiment Analysis. Computers, 13(4), 98. https://doi.org/10.3390/computers13040098