Music Feature Classification Based on Recurrent Neural Networks with Channel Attention Mechanism

Abstract

With the advancement of multimedia and digital technologies, music resources on the Internet have grown rapidly, shifting listeners' habits from local hard drives to online music platforms. This growth has motivated researchers to apply classification technologies to the efficient storage, organization, retrieval, and recommendation of music. Traditional music classification methods rely on many hand-crafted acoustic features, which require domain knowledge in music, and features designed for one classification task often do not generalize to others. This paper addresses the problem by proposing a novel recurrent neural network method with a channel attention mechanism for music feature classification. Music classification based on a convolutional neural network alone ignores the temporal characteristics of the audio, so this paper combines a convolutional structure with a bidirectional recurrent neural network and uses an attention mechanism to assign different attention weights to the recurrent network's outputs at different time steps, yielding a better representation of the overall characteristics of the music. The model reaches a classification accuracy of 93.1% on the GTZAN data set and an AUC of 92.3% on the multilabel tagging data set MagnaTagATune, surpassing the compared methods. An analysis of individual tags shows that the method labels most genre tags well and also performs well on several instrument, vocal, and emotion tags.
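To make the described architecture concrete, the following is a minimal PyTorch sketch of a convolutional front end with a squeeze-and-excitation style channel attention block, a bidirectional GRU over the resulting time steps, and learned temporal attention pooling before the classifier. The layer sizes, the choice of GRU, and the SE-style channel attention are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel reweighting (assumed design)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                               # x: (B, C, F, T)
        w = x.mean(dim=(2, 3))                           # global average pool -> (B, C)
        w = torch.sigmoid(self.fc2(F.relu(self.fc1(w))))
        return x * w[:, :, None, None]                   # rescale each channel


class CRNNWithAttention(nn.Module):
    """CNN + channel attention + BiGRU + temporal attention (illustrative sizes)."""
    def __init__(self, n_mels: int = 96, n_classes: int = 10, hidden: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d((2, 2)),
            nn.Conv2d(64, 128, 3, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.MaxPool2d((2, 2)),
        )
        self.chan_att = ChannelAttention(128)
        self.rnn = nn.GRU(128 * (n_mels // 4), hidden,
                          batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)              # temporal attention scores
        self.cls = nn.Linear(2 * hidden, n_classes)

    def forward(self, spec):                             # spec: (B, 1, n_mels, T)
        h = self.chan_att(self.conv(spec))               # (B, C, F', T')
        B, C, Fr, T = h.shape
        h = h.permute(0, 3, 1, 2).reshape(B, T, C * Fr)  # one feature vector per frame
        out, _ = self.rnn(h)                             # (B, T, 2*hidden)
        alpha = torch.softmax(self.att(out), dim=1)      # attention weight per time step
        pooled = (alpha * out).sum(dim=1)                # attention-weighted summary
        return self.cls(pooled)                          # logits for genres or tags
```

For genre classification (GTZAN, single label) the logits would go through a softmax with cross-entropy loss; for multilabel tagging (MagnaTagATune) a sigmoid with binary cross-entropy per tag is the usual choice.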

Citation (APA)

Gan, J. (2021). Music Feature Classification Based on Recurrent Neural Networks with Channel Attention Mechanism. Mobile Information Systems, 2021. https://doi.org/10.1155/2021/7629994
