Music mood and theme classification - A hybrid approach


Abstract

Music perception is highly intertwined with both emotions and context. Not surprisingly, many of users' information-seeking actions aim at retrieving songs along these perceptual dimensions: moods and themes, expressing how people feel about music or which situations they associate it with. Successfully supporting music retrieval along these dimensions requires powerful methods. Still, most existing approaches to inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms for classifying songs by mood and theme. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for enhanced classification accuracy. © 2009 International Society for Music Information Retrieval.
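The abstract describes fusing audio features with collaborative user annotations. A common way to combine two such classifiers is late fusion: each classifier outputs per-class probabilities, which are then merged by a weighted average. The sketch below illustrates this idea under stated assumptions; the function name, weights, and mood labels are hypothetical and not taken from the paper.

```python
# Illustrative sketch of late fusion for mood classification:
# combine class-probability estimates from an audio-based classifier
# and a tag-based (Last.fm) classifier by a weighted average.

def fuse_predictions(audio_probs, tag_probs, audio_weight=0.5):
    """Weighted average of two per-mood probability dicts."""
    tag_weight = 1.0 - audio_weight
    moods = set(audio_probs) | set(tag_probs)
    return {
        mood: audio_weight * audio_probs.get(mood, 0.0)
              + tag_weight * tag_probs.get(mood, 0.0)
        for mood in moods
    }

# Hypothetical example: audio features favour "aggressive",
# while social tags favour "sad".
audio_probs = {"aggressive": 0.6, "sad": 0.3, "happy": 0.1}
tag_probs = {"aggressive": 0.2, "sad": 0.7, "happy": 0.1}

fused = fuse_predictions(audio_probs, tag_probs, audio_weight=0.4)
predicted_mood = max(fused, key=fused.get)
```

The fusion weight would in practice be tuned on held-out data; the paper's evaluation against the AllMusic.com ground truth suggests that such a merged prediction outperforms either source alone.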

Citation (APA)

Bischoff, K., Firan, C. S., Paiu, R., Nejdl, W., Laurier, C., & Sordo, M. (2009). Music mood and theme classification - A hybrid approach. In Proceedings of the 10th International Society for Music Information Retrieval Conference, ISMIR 2009 (pp. 657–662).
