An Efficient v-Minimum Absolute Deviation Distribution Regression Machine

Abstract

Support Vector Regression (SVR) and its variants are widely used regression algorithms, and they have demonstrated high generalization ability. This research proposes a new SVR-based regressor: the v-minimum absolute deviation distribution regression (v-MADR) machine. Instead of merely minimizing structural risk, as with v-SVR, v-MADR aims to achieve better generalization performance by minimizing both the absolute regression deviation mean and the absolute regression deviation variance, which takes into account the positive and negative values of the regression deviation of sample points. For optimization, we propose a dual coordinate descent (DCD) algorithm for small sample problems, and we also propose an averaged stochastic gradient descent (ASGD) algorithm for large-scale problems. Furthermore, we study the statistical property of v-MADR that leads to a bound on the expectation of error. The experimental results on both artificial and real datasets indicate that our v-MADR has significant improvement in generalization performance with less training time compared to the widely used v-SVR, LS-SVR, ε-TSVR, and linear ε-SVR. Finally, we open-source the code of v-MADR at https://github.com/AsunaYY/v-MADR for wider dissemination.
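To make the idea concrete, the following is a minimal, hypothetical sketch of the ASGD approach the abstract mentions: averaged stochastic (sub)gradient descent on a MADR-style objective that penalizes both the mean and the variance of the absolute regression deviations d_i = |w·x_i + b − y_i|. The exact objective, parameter names (`lam1`, `lam2`, `reg`), and the running-mean approximation of the variance term are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def asgd_madr(X, y, lam1=1.0, lam2=1.0, reg=1e-3, epochs=50, lr=0.05, seed=0):
    """Illustrative ASGD sketch (assumed, not the paper's exact algorithm).

    Minimizes (approximately)
        0.5*reg*||w||^2 + lam1*mean(d) + lam2*var(d),
    where d_i = |w.x_i + b - y_i|. The variance term is handled by keeping a
    running estimate of mean(d) and treating it as a constant per step.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p); b = 0.0
    w_avg = np.zeros(p); b_avg = 0.0
    mean_d = 0.0  # running estimate of the mean absolute deviation
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            r = X[i] @ w + b - y[i]
            d = abs(r)
            s = np.sign(r)
            # subgradient of lam1*d + lam2*(d - mean_d)^2 w.r.t. (w, b)
            g = (lam1 + 2.0 * lam2 * (d - mean_d)) * s
            w -= lr * (g * X[i] + reg * w)
            b -= lr * g
            mean_d += (d - mean_d) / t        # update running mean of d
            # Polyak-Ruppert averaging of the iterates (the "A" in ASGD)
            w_avg += (w - w_avg) / t
            b_avg += (b - b_avg) / t
    return w_avg, b_avg
```

On data with a clear linear trend, the averaged iterates settle near the least-absolute-deviation fit; averaging is what gives stochastic gradient methods their favorable convergence for large-scale problems.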

Citation (APA)
Wang, Y., Wang, Y., Song, Y., Xie, X., Huang, L., Pang, W., & Coghill, G. M. (2020). An Efficient v-Minimum Absolute Deviation Distribution Regression Machine. IEEE Access, 8, 85533–85551. https://doi.org/10.1109/ACCESS.2020.2992703
