Data classification faces several challenges, one of which is that large volumes of data increase computing time. Fractional gradient descent is an unconstrained optimization algorithm for training support vector machine classifiers, whose underlying training problem is convex. Compared with the classical integer-order model, a model built on fractional calculus offers a significant advantage in accelerating computation. This research conducts a qualitative literature review to investigate the current state of this new optimization method and how fractional derivatives can be implemented in classifier algorithms.
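The abstract does not spell out the update rule used in the paper; as an illustration only, the sketch below applies one commonly used Caputo-type fractional gradient step to an L2-regularized linear SVM hinge loss. The function names, hyperparameters, and synthetic data are hypothetical rather than taken from the paper, and setting alpha = 1 recovers ordinary gradient descent.

```python
import numpy as np
from math import gamma

def hinge_subgradient(w, b, X, y, C=1.0):
    """Subgradient of 0.5*||w||^2 + C*sum(max(0, 1 - y*(Xw + b)))."""
    margins = y * (X @ w + b)
    active = margins < 1                      # samples violating the margin
    grad_w = w - C * (y[active][:, None] * X[active]).sum(axis=0)
    grad_b = -C * y[active].sum()
    return grad_w, grad_b

def fractional_gd_svm(X, y, alpha=0.9, lr=0.01, epochs=300, eps=1e-8):
    """Linear SVM trained with a Caputo-style fractional gradient step.

    The gradient is scaled by |x_k - x_{k-1}|^(1 - alpha) / Gamma(2 - alpha);
    with alpha = 1 this factor equals 1 and the update is plain gradient descent.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    w_prev, b_prev = w.copy(), b              # previous iterate for the Caputo term
    for _ in range(epochs):
        gw, gb = hinge_subgradient(w, b, X, y)
        scale_w = (np.abs(w - w_prev) + eps) ** (1 - alpha) / gamma(2 - alpha)
        scale_b = (abs(b - b_prev) + eps) ** (1 - alpha) / gamma(2 - alpha)
        w_prev, b_prev = w.copy(), b
        w -= lr * gw * scale_w
        b -= lr * gb * scale_b
    return w, b

# Toy usage on two synthetic, linearly separable blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
w, b = fractional_gd_svm(X, y, alpha=0.9)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

In this sketch the fractional order alpha acts as an extra knob on the step size: values below 1 rescale each coordinate's step by how much that coordinate moved in the previous iteration, which is one of the mechanisms fractional-order methods use to alter convergence behavior relative to integer-order gradient descent.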
Citation
Hapsari, D. P., Utoyo, I., & Purnami, S. W. (2020). Unconstrained optimization based fractional order derivative for data classification. Journal of Physics: Conference Series, 1613, 012066. IOP Publishing Ltd. https://doi.org/10.1088/1742-6596/1613/1/012066