Although various studies have applied deep learning models to sentiment analysis for financial markets, there is a lack of an embedding method tailored to this domain. The goal of this study is therefore to identify which embedding techniques, combined with different classification algorithms, work best for financial market sentiment analysis, and to present an optimized embedding method for the domain. In this paper we present a broad comparative study of multiple classification models trained for sentiment analysis and improve their performance with an optimized embedding layer. We use a heterogeneous corpus of both formal (news headlines) and informal (tweets) text to increase robustness, and we build the models with CBOW, GloVe, and BERT pre-trained embeddings as well as with an optimized, fine-tuned embedding layer. The best results reported here are achieved by our LSTM model with the fine-tuned embedding layer, which reaches an accuracy of 0.84 and a macro-averaged F1-score of 0.80. Our results provide evidence that the fine-tuned embedding layer is superior to the pre-trained CBOW, GloVe, and BERT embeddings for financial market sentiment analysis. We train SVM, MLP, CNN, generic RNN, and LSTM models, taking a comprehensive approach to both input data and algorithms. As a result, we present a sentiment analysis model with robust performance across different datasets in the domain.
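The abstract does not give implementation details, but the core idea it describes is a sentiment classifier whose embedding layer is initialized from pre-trained vectors and then fine-tuned on the domain corpus, rather than kept frozen. Below is a minimal sketch of that setup; the framework (Keras), vocabulary size, embedding dimension, sequence length, hidden size, and class count are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): an LSTM sentiment classifier whose
# embedding layer starts from pre-trained vectors (e.g. CBOW or GloVe) and is
# fine-tuned during training. All sizes below are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20_000   # assumed vocabulary size
EMBED_DIM = 300       # assumed embedding dimension (e.g., 300-d GloVe)
MAX_LEN = 50          # assumed maximum sequence length
NUM_CLASSES = 3       # e.g., negative / neutral / positive

# Placeholder for a pre-trained embedding matrix; in practice this would be
# built by looking up each vocabulary word in the pre-trained vectors.
pretrained_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

model = models.Sequential([
    layers.Input(shape=(MAX_LEN,)),
    layers.Embedding(
        input_dim=VOCAB_SIZE,
        output_dim=EMBED_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(pretrained_matrix),
        trainable=True,   # fine-tune the embeddings on the financial corpus
    ),
    layers.LSTM(128),     # assumed hidden size
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Setting `trainable=False` on the embedding layer would correspond to the frozen pre-trained baselines the study compares against.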
Yekrangi, M., & Nikolov, N. S. (2023). Domain-Specific Sentiment Analysis: An Optimized Deep Learning Approach for the Financial Markets. IEEE Access, 11, 70248–70262. https://doi.org/10.1109/ACCESS.2023.3293733