Emousic: Emotion and Activity-Based Music Player Using Machine Learning

Abstract

In this paper, we propose a new approach to personalized music playlist generation. Mood is statistically inferred from several data sources, primarily audio, images, text, and sensors: a person's mood is identified from facial expressions and speech tone, while physical activity is detected by the sensors people routinely carry in their cellphones. State-of-the-art data science techniques now make it computationally feasible to identify such activities from very large datasets; machine learning models trained on this data can classify and predict a user's mood and activities for their benefit. Emousic is a real-time mood and activity recognition use case: a smart music player that continually learns your listening habits and selects songs based on your past preferences, current mood, and activity. It is, in short, a personalized playlist generator.
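To make the sensor-based activity recognition described above concrete, here is a minimal sketch (not the authors' implementation) of how a player might classify phone accelerometer windows into activities and map the prediction to a playlist. The feature set, activity labels, classifier choice, and playlist mapping are all illustrative assumptions; the synthetic data stands in for real labeled sensor recordings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-axis mean and std plus signal magnitude area for one accel window."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           [np.abs(window).sum() / len(window)]])

def make_window(activity):
    """Synthetic stand-in for a labeled accelerometer window.
    0 = still, 1 = walking, 2 = running; vibration amplitude grows with effort."""
    amp = [0.05, 0.5, 1.5][activity]
    return rng.normal(0.0, amp, size=(50, 3))  # 50 samples x 3 axes

# Build a small labeled training set (40 windows per activity).
X = np.array([extract_features(make_window(a)) for a in range(3) for _ in range(40)])
y = np.repeat([0, 1, 2], 40)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

PLAYLISTS = {0: "ambient", 1: "pop", 2: "workout"}  # assumed activity-to-playlist map
pred = int(clf.predict([extract_features(make_window(2))])[0])
print(PLAYLISTS[pred])
```

In the full system described by the paper, the predicted activity would be combined with mood signals from the face and voice before a song is chosen; this sketch covers only the sensor branch.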

Citation (APA)

Sarda, P., Halasawade, S., Padmawar, A., & Aghav, J. (2019). Emousic: Emotion and Activity-Based Music Player Using Machine Learning. In Advances in Intelligent Systems and Computing (Vol. 924, pp. 179–188). Springer Verlag. https://doi.org/10.1007/978-981-13-6861-5_16
