A Linked Data Model for Multimodal Sentiment and Emotion Analysis


Abstract

The number of tools and services for sentiment analysis is increasing rapidly. Unfortunately, the lack of standard formats hinders interoperability. To tackle this problem, previous work has proposed the NLP Interchange Format (NIF) as both a common semantic format and an API for textual sentiment analysis. However, that approach creates a gap between textual and sentiment analysis that hampers multimodality. This paper presents a multimedia extension of NIF that can be leveraged for multimodal applications. The application of this extended model is illustrated with a service that annotates online videos with their sentiment, and with the use of SPARQL to retrieve results for different modes.
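As a rough illustration of the kind of mode-specific SPARQL retrieval the abstract describes, the sketch below selects sentiment annotations attached to the audio track of a video. The namespace, class, and property names (`ex:`, `SentimentAnnotation`, `mode`, `polarity`) are hypothetical placeholders, not the paper's actual extended NIF vocabulary:

```sparql
PREFIX ex: <http://example.org/annotation#>

# Retrieve sentiment annotations for one mode of a multimedia resource.
SELECT ?annotation ?polarity
WHERE {
  ?annotation a ex:SentimentAnnotation ;
              ex:mode "audio" ;          # filter by modality: audio, video, text...
              ex:polarity ?polarity .
}
```

Swapping the `ex:mode` literal (e.g. `"text"` for the transcript) would retrieve results for a different modality over the same annotated resource.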

Citation (APA)

Sánchez-Rada, J. F., Iglesias, C. A., & Gil, R. (2015). A Linked Data Model for Multimodal Sentiment and Emotion Analysis. In Proceedings of the 4th Workshop on Linked Data in Linguistics: Resources and Applications, LDL 2015 - collocated with 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2015 (pp. 11–19). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w15-4202
