High-Accuracy Clothing and Style Classification via Multi-Feature Fusion

Abstract

In recent years, the online selection of virtual clothing styles has been used to explore and expand diverse personal aesthetics, and it also represents a broad reform of, and challenge to, the clothing industry. Building on existing clothing style categories, this paper proposes a style classification method that combines fine-grained and coarse-grained techniques. Furthermore, a new deep neural network is proposed that improves recognition robustness and avoids interference from the image background through pan learning and background learning of image features. To study the relationship between fine-grained clothing attributes and overall style, the clothing types are learned first to pre-train the model parameters. Second, the pre-trained parameters from the first stage are transferred and fine-tuned so that the model becomes better suited to identifying coarse-grained style types. Finally, a network structure based on a dual attention mechanism is proposed: different attention mechanisms are added at different stages of the network to strengthen its feature representations and improve the final classification accuracy. In the experiments, we collected 50,000 images covering 10 clothing styles to train and evaluate the models. The results show that the proposed classification method can effectively distinguish clothing styles and types.
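To make the two-stage, coarse-/fine-grained training idea and the dual attention design more concrete, below is a minimal PyTorch-style sketch. It is not the authors' implementation: the ResNet-18 backbone, the placement of the attention blocks, the assumed number of clothing-type classes (20), and all hyperparameters are illustrative assumptions; only the 10 style classes and the two-stage transfer/fine-tuning scheme come from the abstract.

```python
# Sketch only: channel/spatial attention inserted at different backbone stages,
# stage-1 pre-training on fine-grained clothing types, then transfer and
# fine-tuning of the same parameters for the 10 coarse-grained styles.
import torch
import torch.nn as nn
from torchvision import models


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)


class SpatialAttention(nn.Module):
    """Spatial attention mask over pooled channel statistics (assumed form)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.amax(dim=1, keepdim=True)
        mask = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * mask


class DualAttentionNet(nn.Module):
    """ResNet-18 backbone with channel attention after an early stage and
    spatial attention after the last stage, then a linear classifier."""
    def __init__(self, num_classes):
        super().__init__()
        base = models.resnet18(weights=None)
        self.stem = nn.Sequential(base.conv1, base.bn1, base.relu, base.maxpool)
        self.stage1, self.stage2 = base.layer1, base.layer2
        self.stage3, self.stage4 = base.layer3, base.layer4
        self.chan_att = ChannelAttention(128)   # stage2 of ResNet-18 outputs 128 channels
        self.spat_att = SpatialAttention()      # applied to the 512-channel final feature map
        self.head = nn.Linear(512, num_classes)

    def forward(self, x):
        x = self.stem(x)
        x = self.stage2(self.stage1(x))
        x = self.chan_att(x)                    # emphasize informative feature channels
        x = self.stage4(self.stage3(x))
        x = self.spat_att(x)                    # down-weight background regions
        x = torch.flatten(nn.functional.adaptive_avg_pool2d(x, 1), 1)
        return self.head(x)


# Stage 1: pre-train on fine-grained clothing *types* (class count assumed).
model = DualAttentionNet(num_classes=20)
# ... train on clothing-type labels ...

# Stage 2: transfer the stage-1 parameters and fine-tune for the 10 styles.
model.head = nn.Linear(512, 10)                 # new coarse-grained style head
optimizer = torch.optim.SGD(
    [
        {"params": model.head.parameters(), "lr": 1e-2},
        {"params": [p for n, p in model.named_parameters()
                    if not n.startswith("head")], "lr": 1e-4},  # smaller lr for transferred layers
    ],
    momentum=0.9,
)
# ... fine-tune on clothing-style labels ...
```

The smaller learning rate on the transferred layers is one common way to fine-tune pre-trained parameters without destroying them; the paper may use a different schedule.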

Citation (APA)
Chen, X., Deng, Y., Di, C., Li, H., Tang, G., & Cai, H. (2022). High-Accuracy Clothing and Style Classification via Multi-Feature Fusion. Applied Sciences (Switzerland), 12(19). https://doi.org/10.3390/app121910062
