Parallel strategy based on parameter selection of machine learning model


Abstract

With the improvement of hardware capability and the rapid development of artificial intelligence, machine learning algorithms are widely used in intelligent products and intelligent analysis, and have greatly improved our quality of life. Machine learning model parameters include ordinary parameters and hyper-parameters. Hyper-parameters are external configurations of the model whose values must be set manually. Hyper-parameter optimization typically relies on a large number of parameter iterations and empirical methods. In addition, because the search range of hyper-parameters is relatively large, selecting them takes a great deal of time, wasting resources and money. This paper designs a general parallelization strategy for hyper-parameter selection in machine learning models and implements the selection process with multiple processes, which greatly reduces program running time and improves development efficiency.
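The abstract describes parallelizing hyper-parameter selection across multiple processes. As a minimal sketch of that idea (not the authors' implementation), a candidate grid can be evaluated concurrently with Python's `multiprocessing.Pool`; the objective function here is a hypothetical stand-in for training a model and returning its validation score:

```python
from itertools import product
from multiprocessing import Pool

def evaluate(params):
    # Hypothetical objective: stands in for training a model with
    # these hyper-parameters and returning its validation score.
    lr, reg = params
    score = -((lr - 0.1) ** 2 + (reg - 0.01) ** 2)
    return params, score

if __name__ == "__main__":
    # Illustrative hyper-parameter grid (learning rate x regularization).
    learning_rates = [0.001, 0.01, 0.1, 1.0]
    regularizations = [0.0001, 0.001, 0.01, 0.1]
    grid = list(product(learning_rates, regularizations))

    # Evaluate every setting in parallel across worker processes,
    # instead of looping over the grid sequentially.
    with Pool(processes=4) as pool:
        results = pool.map(evaluate, grid)

    best_params, best_score = max(results, key=lambda r: r[1])
    print("best:", best_params, "score:", best_score)
```

Because each grid point is evaluated independently, the speedup scales with the number of worker processes, up to the size of the grid.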

Citation (APA)

Qin, Y., & Ji, Z. (2019). Parallel strategy based on parameter selection of machine learning model. In Advances in Intelligent Systems and Computing (Vol. 856, pp. 1199–1206). Springer Verlag. https://doi.org/10.1007/978-3-030-00214-5_147
