Abstract
The so-called pinball loss for estimating conditional quantiles is a well-known tool in both statistics and machine learning. So far, however, little work has been done to quantify the efficiency of this tool for nonparametric approaches. We fill this gap by establishing inequalities that describe how close approximate pinball risk minimizers are to the corresponding conditional quantile. These inequalities, which hold under mild assumptions on the data-generating distribution, are then used to establish so-called variance bounds, which recently turned out to play an important role in the statistical analysis of (regularized) empirical risk minimization approaches. Finally, we use both types of inequalities to establish an oracle inequality for support vector machines that use the pinball loss. The resulting learning rates are minimax optimal under some standard regularity assumptions on the conditional quantile.
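To make the abstract's central object concrete: the pinball loss at quantile level τ penalizes underestimates with weight τ and overestimates with weight 1 − τ, so its expected-risk minimizer is the conditional τ-quantile. A minimal numerical sketch (not the paper's method; function and variable names here are illustrative) shows that minimizing the empirical pinball risk over a grid of constants recovers the sample quantile:

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Mean pinball (quantile) loss of predicting the constant q,
    for quantile level tau in (0, 1)."""
    residual = y - q
    # tau * residual when y >= q, (tau - 1) * residual when y < q
    return np.mean(np.maximum(tau * residual, (tau - 1) * residual))

# Illustrative data: the empirical risk minimizer over a grid of
# candidate constants approximates the sample tau-quantile.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
tau = 0.9
grid = np.linspace(-3.0, 3.0, 601)  # step 0.01
risks = [pinball_loss(y, q, tau) for q in grid]
q_hat = grid[int(np.argmin(risks))]
```

Here `q_hat` lands within grid resolution of `np.quantile(y, 0.9)`, illustrating why pinball risk minimization targets quantiles rather than the mean.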
Citation
Steinwart, I., & Christmann, A. (2011). Estimating conditional quantiles with the help of the pinball loss. Bernoulli, 17(1), 211–225. https://doi.org/10.3150/10-BEJ267