A Loss Function for Classification Based on a Robust Similarity Metric


Abstract

We present a margin-based loss function for classification, inspired by the recently proposed similarity measure called correntropy. We show that correntropy induces a non-convex loss function that is a closer approximation to the misclassification loss (the ideal 0-1 loss). We show that the discriminant function obtained by optimizing the proposed loss function with a neural network is insensitive to outliers and generalizes better than one trained with the squared loss commonly used in neural network classifiers. The proposed method of training classifiers is a practical way of obtaining better results on real-world classification problems, using a simple gradient-based online training procedure to minimize the empirical risk.
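
The abstract describes the training recipe only at a high level. As a rough illustration (not necessarily the authors' exact formulation), correntropy between a target y and the network output f(x) is typically estimated with a Gaussian kernel of the error e = y - f(x), and maximizing it amounts to minimizing the bounded, non-convex loss 1 - exp(-e^2 / (2 sigma^2)). The sketch below trains a small one-hidden-layer network with this loss by online (per-sample) gradient descent; the network size, kernel width sigma, learning rate, and the toy data are illustrative assumptions.

    import numpy as np

    def correntropy_loss(e, sigma=1.0):
        # Bounded, non-convex loss induced by a Gaussian correntropy kernel.
        # It saturates for large |e|, unlike the squared loss.
        return 1.0 - np.exp(-e**2 / (2.0 * sigma**2))

    def correntropy_loss_grad(e, sigma=1.0):
        # dL/de; it decays toward zero for outlier-sized errors.
        return (e / sigma**2) * np.exp(-e**2 / (2.0 * sigma**2))

    class TinyMLP:
        # One-hidden-layer tanh network for targets coded as +/-1.
        def __init__(self, n_in, n_hidden, seed=0):
            rng = np.random.default_rng(seed)
            self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
            self.b1 = np.zeros(n_hidden)
            self.w2 = rng.normal(0.0, 0.5, n_hidden)
            self.b2 = 0.0

        def forward(self, x):
            self.h = np.tanh(self.W1 @ x + self.b1)
            return np.tanh(self.w2 @ self.h + self.b2)

        def online_step(self, x, y, sigma=1.0, lr=0.1):
            # One stochastic-gradient update of the correntropy-induced empirical risk.
            out = self.forward(x)
            e = y - out
            delta_out = -correntropy_loss_grad(e, sigma) * (1.0 - out**2)  # chain rule through tanh
            delta_h = delta_out * self.w2 * (1.0 - self.h**2)
            self.w2 -= lr * delta_out * self.h
            self.b2 -= lr * delta_out
            self.W1 -= lr * np.outer(delta_h, x)
            self.b1 -= lr * delta_h
            return correntropy_loss(e, sigma)

    # Illustrative usage on toy two-dimensional data with +/-1 labels.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] + X[:, 1])
    net = TinyMLP(n_in=2, n_hidden=8)
    for epoch in range(20):
        for xi, yi in zip(X, y):
            net.online_step(xi, yi, sigma=1.0, lr=0.1)

Because the loss saturates for large errors, its gradient shrinks toward zero on outlier-sized errors, which is the mechanism behind the claimed insensitivity to outliers; with the squared loss the same errors would dominate the updates.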

Authors

  • Abhishek Singh
