Support Vector Machine optimization with fractional gradient descent for data classification

  • Hapsari D
  • Utoyo I
  • Purnami S

Abstract

Data classification faces several problems, one of which is that large amounts of data increase computing time. SVM is a reliable classifier for linear or non-linear data, but for large-scale data it runs into computational time constraints. Fractional gradient descent is an unconstrained optimization algorithm for training support vector machine classifiers, whose training problem is convex. Compared with the classic integer-order model, a model built with fractional calculus has a significant advantage in accelerating computing time. This research investigates the current state of this new optimization method based on fractional derivatives and how it can be implemented in the classifier algorithm. The SVM classifier with fractional gradient descent optimization reaches a convergence point approximately 50 iterations earlier than SVM-SGD. The model-update step is smaller in the fractional case because the multiplier value is less than 1, i.e., fractional. The SVM-Fractional SGD algorithm is shown to be an effective method for rainfall forecast decisions.
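The abstract does not spell out the update rule, but the idea it describes (hinge-loss SVM trained by SGD where the step is damped by a fractional-order multiplier smaller than 1) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Caputo-style scaling factor |w − w_prev|^(1−α)/Γ(2−α) is one common way fractional gradient descent is approximated, and all function names, hyperparameters, and the toy data are assumptions.

```python
import numpy as np
from math import gamma

def svm_fractional_sgd(X, y, alpha=0.9, lr=0.01, lam=0.01, epochs=50, seed=0):
    """Linear SVM trained with a fractional-order SGD sketch.

    The hinge-loss subgradient is scaled element-wise by a Caputo-style
    factor |w - w_prev|^(1 - alpha) / Gamma(2 - alpha); for 0 < alpha < 1
    this multiplier is typically below 1, damping each update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    w_prev = np.full(d, 1e-8)          # avoid a 0^(1-alpha) singularity
    frac_c = 1.0 / gamma(2.0 - alpha)  # constant from the Caputo derivative
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                       # point violates the margin
                g_w = lam * w - y[i] * X[i]
                g_b = -y[i]
            else:                                # only the regularizer acts
                g_w = lam * w
                g_b = 0.0
            # Caputo-style fractional scaling of the step
            scale = frac_c * np.abs(w - w_prev) ** (1.0 - alpha) + 1e-12
            w_new = w - lr * g_w * scale
            b -= lr * g_b
            w_prev, w = w, w_new
    return w, b

# Toy linearly separable data (two Gaussian blobs), purely illustrative.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
w, b = svm_fractional_sgd(X, y)
acc = (np.sign(X @ w + b) == y).mean()
```

Because the multiplier shrinks as w stops moving, the effective step size decays near the optimum, which is consistent with the abstract's remark that the fractional update is smaller than the integer-order one.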

Citation (APA)

Hapsari, D. P., Utoyo, I., & Purnami, S. W. (2021). Support Vector Machine optimization with fractional gradient descent for data classification. Journal of Applied Sciences, Management and Engineering Technology, 2(1), 1–6. https://doi.org/10.31284/j.jasmet.2021.v2i1.1467
