Making good probability estimates for regression

Abstract

In this paper, we show that the optimisation of density forecasting models for regression in machine learning can be formulated as a multi-objective problem. We describe the two objectives of sharpness and calibration and suggest suitable scoring metrics for both. We use the popular negative log-likelihood as a measure of sharpness and the probability integral transform as a measure of calibration. To optimise density forecasting models under multiple criteria we introduce a multi-objective evolutionary optimisation framework that can produce better density forecasts from a prediction user's perspective. Our experiments show improvements over the state-of-the-art on a risk management problem. © Springer-Verlag Berlin Heidelberg 2006.
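
As an illustrative sketch only (not code from the paper), the snippet below computes the two scores named in the abstract for hypothetical Gaussian density forecasts: the negative log-likelihood as a sharpness score, and the probability integral transform (PIT) values as a calibration check, which should be approximately uniform on [0, 1] for a well-calibrated model. The data, the forecast parameters, and the Gaussian forecast form are all assumptions made for the example.

    # Illustrative sketch: scoring Gaussian density forecasts (assumed setup).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Hypothetical targets and per-point Gaussian forecasts (mean, std).
    y = rng.normal(size=200)
    mu = y + rng.normal(scale=0.3, size=200)   # forecast means
    sigma = np.full(200, 0.5)                  # forecast standard deviations

    # Sharpness: mean negative log-likelihood of the observations
    # under the forecast densities (lower is sharper/better).
    nll = -norm.logpdf(y, loc=mu, scale=sigma).mean()

    # Calibration: PIT values, i.e. the forecast CDF evaluated at the
    # observations; approximately uniform on [0, 1] when calibrated.
    pit = norm.cdf(y, loc=mu, scale=sigma)

    print(f"NLL (sharpness): {nll:.3f}")
    print(f"PIT mean/std (uniform target ~0.500/0.289): {pit.mean():.3f}/{pit.std():.3f}")

In practice these two scores can disagree (a sharper model may be less calibrated), which is what motivates treating the problem as multi-objective rather than optimising a single criterion.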

Cite

APA

Carney, M., & Cunningham, P. (2006). Making good probability estimates for regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4212 LNAI, pp. 582–589). Springer Verlag. https://doi.org/10.1007/11871842_55
