Abstract
We present EMIR, an Emotional Music Information Retrieval system for mobile devices. It uses a machine learning approach to detect latent emotion in both user queries (non-descriptive queries) and song lyrics, and combines the two to build an effective Music Information Retrieval system. Emotion extracted from the songs and queries is mapped into a high-dimensional emotion space, which allows conventional text retrieval techniques to compute the similarity between a user query and the latent emotion in song lyrics, producing a ranked list of songs for playback. © 2012 Springer-Verlag.
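The ranking step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each song's lyrics and the user query have already been mapped to emotion vectors, and uses cosine similarity (a conventional text retrieval measure) to rank songs. The dimension labels and song names are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two emotion vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def rank_songs(query_vec, song_vecs):
    """Rank songs by similarity of their lyric emotion vectors to the query."""
    scored = [(title, cosine_similarity(query_vec, vec))
              for title, vec in song_vecs.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Hypothetical 4-dimensional emotion vectors (e.g., joy, sadness, anger, calm);
# the paper's emotion space is high-dimensional, these axes are illustrative.
songs = {
    "Song A": [0.9, 0.1, 0.0, 0.3],
    "Song B": [0.1, 0.8, 0.1, 0.2],
}
query = [0.8, 0.2, 0.0, 0.4]  # emotion vector inferred from the user's query
ranking = rank_songs(query, songs)  # most similar song first
```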
Citation
Zhou, L., Lin, H., & Gurrin, C. (2012). EMIR: A novel music retrieval system for mobile devices incorporating analysis of user emotion. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7131 LNCS, pp. 627–629). https://doi.org/10.1007/978-3-642-27355-1_59