PAC-Bayes unleashed: Generalisation bounds with unbounded losses


Abstract

We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss functions. This extends the relevance and applicability of the PAC-Bayes learning framework, where most of the existing literature focuses on supervised learning problems with a bounded loss function (typically assumed to take values in the interval [0, 1]). To relax this classical assumption, we allow the range of the loss to depend on each predictor. This relaxation is captured by our new notion of HYPothesis-dependent rangE (HYPE). Based on this, we derive a novel PAC-Bayesian generalisation bound for unbounded loss functions, and we instantiate it on a linear regression problem. To make our theory usable by the widest possible audience, we include discussions on actual computation, practicality, and the limitations of our assumptions.
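To make the hypothesis-dependent range idea concrete, here is a minimal illustrative sketch in Python. The setting (squared loss, linear predictors, data radius `R` and label bound `B`) is our own assumption for illustration, not taken from the paper's definitions: for a linear predictor with weight vector w, squared loss on data with ‖x‖ ≤ R and |y| ≤ B is at most (‖w‖R + B)² by the Cauchy–Schwarz inequality, so the loss range varies with the predictor rather than fitting into a single fixed interval such as [0, 1].

```python
import numpy as np

def hype_sq_loss(w, R=1.0, B=1.0):
    """Per-predictor upper bound on the squared loss (w.x - y)^2,
    assuming ||x|| <= R and |y| <= B.

    By Cauchy-Schwarz, |w.x| <= ||w|| * R, hence
    (w.x - y)^2 <= (||w|| * R + B)^2.
    This plays the role of a hypothesis-dependent range: the bound
    grows with ||w||, so no single bounded interval covers all
    predictors at once.
    """
    return (np.linalg.norm(w) * R + B) ** 2

# A small-norm predictor has a small loss range...
print(hype_sq_loss(np.array([0.1, 0.1])))   # ~1.30
# ...while a large-norm predictor has a much larger one.
print(hype_sq_loss(np.array([3.0, 4.0])))   # (5*1 + 1)^2 = 36.0
```

This is the kind of quantity a hypothesis-dependent analysis can exploit: bounds can be stated in terms of each predictor's own loss range instead of a single worst-case constant.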

Citation (APA)

Haddouche, M., Guedj, B., Rivasplata, O., & Shawe-Taylor, J. (2021). PAC-Bayes unleashed: Generalisation bounds with unbounded losses. Entropy, 23(10). https://doi.org/10.3390/e23101330
