Learning Music Emotions via Quantum Convolutional Neural Network

Abstract

Music can convey and evoke powerful emotions, but recognizing music emotions accurately with computational models is very challenging. The difficulty of the problem increases sharply when music segments deliver multiple, complex emotions. This paper proposes a novel quantum convolutional neural network (QCNN) to learn music emotions. Inheriting the strong abstraction ability of deep learning, QCNN automatically extracts music features that benefit emotion classification. The main contribution of this paper is that we utilize the measurement postulate to simulate human emotion awareness in music appreciation. Statistical experiments on a standard dataset show that QCNN outperforms classical algorithms as well as the state-of-the-art in the task of music emotion classification. Moreover, we provide a demonstration experiment that explains the good performance of the proposed technique from the perspectives of physics and psychology.
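The abstract's key idea is applying the quantum measurement postulate (the Born rule) to emotion classification: a feature vector is treated as a quantum state whose normalized squared amplitudes give the probability of "collapsing" to each emotion label. The following is a minimal illustrative sketch of that idea only; the emotion labels, feature values, and function names are hypothetical and not taken from the paper.

```python
import math
import random

# Hypothetical emotion classes; the paper's actual label set may differ.
EMOTIONS = ["happy", "sad", "tense", "calm"]

def measurement_probabilities(features):
    """Treat feature activations as amplitudes and apply the Born rule:
    p_i = |a_i|^2 after normalizing the vector to unit length."""
    norm = math.sqrt(sum(f * f for f in features))
    if norm == 0:
        # Degenerate state: fall back to a uniform distribution.
        return [1.0 / len(features)] * len(features)
    return [(f / norm) ** 2 for f in features]

def measure(features, rng=random.random):
    """Simulate one 'measurement': sample an emotion label according
    to the Born-rule probabilities."""
    probs = measurement_probabilities(features)
    r, acc = rng(), 0.0
    for label, p in zip(EMOTIONS, probs):
        acc += p
        if r <= acc:
            return label
    return EMOTIONS[-1]

# Example: a feature vector with a dominant first activation yields
# a distribution concentrated on the first emotion class.
probs = measurement_probabilities([0.9, 0.3, 0.1, 0.2])
```

The probabilities always sum to one, so repeated measurement of the same state yields a distribution over emotions rather than a single hard label, which is one hedged reading of how the postulate could model the ambiguity of multi-emotion music segments.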

Cite

APA

Chen, G., Liu, Y., Cao, J., Zhong, S., Liu, Y., Hou, Y., & Zhang, P. (2017). Learning Music Emotions via Quantum Convolutional Neural Network. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10654 LNAI, pp. 49–58). Springer Verlag. https://doi.org/10.1007/978-3-319-70772-3_5
