Music emotion recognition (MER) has become a prominent area of interest in the music information retrieval (MIR) community, with the objective of providing more flexibility in content-based music retrieval. Categorizing music according to its emotional characteristics is important, as it enables users to retrieve music that matches their cognitive state. In this work, we consider low-level time-domain and spectral features extracted from the music signal. Instead of taking a wide range of features, they are chosen judiciously based on our perception of each particular emotion. For classification, unsupervised approaches based on K-means and agglomerative clustering are used. Experiments are carried out on a benchmark dataset. A performance comparison with existing work reflects the superiority of the proposed approach.
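The pipeline the abstract describes (low-level features fed to K-means and agglomerative clustering) can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the feature matrix here is random placeholder data standing in for per-clip descriptors such as zero-crossing rate, RMS energy, and spectral centroid, and the cluster count of four (one per assumed emotion class) is a hypothetical choice.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder feature matrix: 40 hypothetical music clips x 3 low-level
# features (e.g. zero-crossing rate, RMS energy, spectral centroid).
# In the actual work these would be extracted from the audio signal.
features = rng.normal(size=(40, 3))

# Standardize so no single feature dominates the distance computations.
X = StandardScaler().fit_transform(features)

# Unsupervised grouping with both clustering approaches from the paper;
# n_clusters=4 is an assumed number of emotion categories.
kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
agglo_labels = AgglomerativeClustering(n_clusters=4).fit_predict(X)

print(sorted(set(kmeans_labels)), sorted(set(agglo_labels)))
```

Each clip then carries a cluster label that can be mapped to an emotion category, e.g. by inspecting representative members of each cluster.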
CITATION STYLE
Sarkar, R., Dutta, S., Roy, A., & Saha, S. K. (2018). Emotion based categorization of music using low level features and agglomerative clustering. In Communications in Computer and Information Science (Vol. 841, pp. 506–516). Springer Verlag. https://doi.org/10.1007/978-981-13-0020-2_44