Unconstrained optimization based fractional order derivative for data classification

Abstract

Data classification faces several challenges, one of which is the large volume of data, which increases computing time. The fractional gradient descent method is an unconstrained optimization algorithm for training support vector machine classifiers, whose underlying training problem is convex. Compared with the classical integer-order model, a model built with fractional calculus has a significant advantage in accelerating computation. This research conducts a qualitative literature review to investigate the current state of this new optimization method and how fractional derivatives can be implemented in classifier algorithms.
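The abstract names fractional gradient descent for SVM training but gives no details. The sketch below is an illustration, not the authors' implementation: it uses one common Caputo-style formulation in which the ordinary (sub)gradient step is scaled by |w_k − w_{k−1}|^(1−α) / Γ(2−α). The toy dataset, hyperparameters (α, learning rate, iteration count), and function names are all assumptions.

```python
import numpy as np
from math import gamma

# Toy linearly separable data (an assumption; not the paper's dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

def svm_subgrad(w, X, y, C=1.0):
    """Subgradient of the convex SVM objective 0.5*||w||^2 + C*mean(hinge)."""
    margins = y * (X @ w)
    viol = margins < 1          # samples violating the margin
    g = w.copy()                # gradient of the regularizer
    if viol.any():
        g -= C * (y[viol, None] * X[viol]).sum(axis=0) / len(y)
    return g

def fractional_gd(X, y, alpha=0.9, lr=0.5, iters=300, eps=1e-8):
    """Caputo-style fractional gradient descent (illustrative sketch).

    The step is the ordinary subgradient scaled elementwise by
    |w_k - w_{k-1}|^(1-alpha) / Gamma(2-alpha); eps avoids a zero
    factor on the first iteration, where w_k == w_{k-1}.
    """
    w = np.zeros(X.shape[1])
    w_prev = w.copy()
    for _ in range(iters):
        g = svm_subgrad(w, X, y)
        scale = (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
        w, w_prev = w - lr * scale * g, w
    return w

w = fractional_gd(X, y)
acc = float(np.mean(np.sign(X @ w) == y))
print(f"training accuracy: {acc:.3f}")
```

Setting α = 1 recovers (up to the eps term) classical integer-order gradient descent, which is the comparison the abstract draws.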

Citation (APA)
Hapsari, D. P., Utoyo, I., & Purnami, S. W. (2020). Unconstrained optimization based fractional order derivative for data classification. In Journal of Physics: Conference Series (Vol. 1613). IOP Publishing Ltd. https://doi.org/10.1088/1742-6596/1613/1/012066
