Journal article

Robust regularized kernel regression

Zhu J, Hoi S, Lyu M

IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 38, issue 6 (2008) pp. 1639-1644


Abstract

Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in its dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formulation for robust regularized kernel regression under the theoretical framework of regularization networks and tackle the optimization problem directly in the primal. We show that the primal and dual approaches are equivalent in that they achieve similar regression performance, but the primal formulation is more efficient and easier to implement than the dual one. Unlike previous work, our approach also optimizes the bias term. In addition, we show that the proposed solution extends easily to other robust loss functions, including the Huber-ε insensitive loss function. Finally, we conduct a set of experiments on both artificial and real data sets, whose promising results show that the proposed method is effective and more efficient than traditional approaches.
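The abstract does not spell out the primal solver, so the following is only a minimal sketch of what robust regularized kernel regression in the primal can look like: iteratively reweighted least squares (IRLS) with the Huber loss, solving jointly for the expansion coefficients and the bias term b, as the abstract emphasizes. This is not necessarily the authors' exact algorithm; the RBF kernel choice and all names (robust_krr_primal, lam, gamma, delta) are illustrative assumptions.

import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X1 and X2.
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def robust_krr_primal(X, y, lam=1e-2, gamma=1.0, delta=1.0,
                      n_iter=50, tol=1e-8):
    # Sketch (not the paper's algorithm): minimize
    #   sum_i huber(y_i - f(x_i)) + (lam/2) ||f||_H^2,
    # with f(x) = sum_j beta_j k(x_j, x) + b, via IRLS.
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    w = np.ones(n)                      # Huber weights, start uniform
    beta, b = np.zeros(n), 0.0
    for _ in range(n_iter):
        W = np.diag(w)
        # Stationarity conditions for beta and the bias b:
        #   (W K + lam I) beta + b W 1 = W y
        #   1^T W K beta  + b 1^T W 1  = 1^T W y
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = W @ K + lam * np.eye(n)
        A[:n, n] = w
        A[n, :n] = w @ K
        A[n, n] = w.sum()
        rhs = np.concatenate([w * y, [w @ y]])
        sol = np.linalg.solve(A, rhs)
        beta_new, b_new = sol[:n], sol[n]
        converged = np.max(np.abs(beta_new - beta)) < tol
        beta, b = beta_new, b_new
        if converged:
            break
        # Huber IRLS weights: quadratic inside |r| <= delta,
        # linear (down-weighted) outside, which tames outliers.
        r = y - (K @ beta + b)
        absr = np.maximum(np.abs(r), 1e-12)
        w = np.where(absr <= delta, 1.0, delta / absr)
    return beta, b

# Toy usage: a noisy sine with a few gross outliers injected.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(60, 1)), axis=0)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
y[::15] += 5.0                          # gross outliers
beta, b = robust_krr_primal(X, y, lam=1e-2, gamma=0.5, delta=0.5)
y_hat = rbf_kernel(X, X, 0.5) @ beta + b

Each IRLS step reduces to one (n+1)-by-(n+1) linear solve that includes the bias as an extra unknown, which is in the spirit of the abstract's claim that a primal formulation avoids a quadratic programming solver while still optimizing the bias term.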

Author-supplied keywords

  • Kernel regression
  • Regularized least squares (RLS)
  • Robust estimator
  • Support vector machine (SVM)



Authors

  • Jianke Zhu

  • Steven C. H. Hoi

  • Michael Rung-Tsong Lyu
