Gradient Ascent for Best Response Regression

Abstract

Although regression is among the oldest areas of statistics, new approaches are still being found. One recent suggestion is Best Response Regression, in which one tries to find a regression function that provides, for as many instances as possible, a better prediction than some reference regression function. In this paper we propose a new method for best response regression that is based on gradient ascent rather than mixed integer programming. We evaluate our approach for a variety of noise (or error) distributions, showing that, especially for heavy-tailed distributions, best response regression outperforms ordinary least squares regression on unseen data, both with respect to the sum of squared errors and the number of instances for which better predictions are provided.
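The core idea can be sketched in code. The snippet below is an illustrative reconstruction, not the authors' exact algorithm: it fits an ordinary least squares reference model, then performs gradient ascent on a sigmoid surrogate of the (non-differentiable) count of instances on which a second linear model achieves a smaller squared error than the reference. The function name, the surrogate sharpness `k`, and all hyperparameter values are assumptions made for illustration.

```python
import numpy as np

def best_response_fit(X, y, steps=2000, lr=0.1, k=5.0, seed=0):
    """Illustrative sketch of best response regression via gradient ascent.

    Maximizes a smooth surrogate of the number of instances on which a
    linear model beats an OLS reference model (hypothetical hyperparameters).
    """
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])       # add intercept column
    w_ref, *_ = np.linalg.lstsq(Xb, y, rcond=None)  # OLS reference model
    w = w_ref + 0.01 * rng.standard_normal(w_ref.shape)  # start near OLS
    e_ref = y - Xb @ w_ref                          # fixed reference residuals
    for _ in range(steps):
        e = y - Xb @ w                              # current model residuals
        m = e_ref**2 - e**2                         # margin: > 0 where we beat OLS
        s = 1.0 / (1.0 + np.exp(-np.clip(k * m, -50, 50)))  # sigmoid of 1[m > 0]
        # d/dw of mean(sigmoid(k * m)): sigmoid'(k*m) * k * dm/dw, dm/dw = 2*e*Xb
        grad = (s * (1.0 - s) * k * 2.0 * e) @ Xb / len(y)
        w = w + lr * grad                           # ascent step on the surrogate
    return w, w_ref
```

The sigmoid weight `s * (1 - s)` effectively downweights instances whose margin is already decided in either direction, so heavy-tailed outliers stop dominating the fit once the model has clearly lost them, in contrast to least squares.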

Citation (APA)

Racher, V., & Borgelt, C. (2021). Gradient Ascent for Best Response Regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12695 LNCS, pp. 141–154). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-74251-5_12
