Model Inference and Averaging

  • Hastie T
  • Friedman J
  • Tibshirani R

Abstract

For most of this book, the fitting (learning) of models has been achieved by minimizing a sum of squares for regression, or by minimizing cross-entropy for classification. In fact, both of these minimizations are instances of the maximum likelihood approach to fitting. In this chapter we provide a general exposition of the maximum likelihood approach, as well as the Bayesian method for inference. The bootstrap, introduced in Chapter 7, is discussed in this context, and its relation to maximum likelihood and Bayes is described. Finally, we present some related techniques for model averaging and improvement, including committee methods, bagging, stacking and bumping.

8.1 The Bootstrap and Maximum Likelihood Method

8.1.1 A Smoothing Example

The bootstrap method provides a direct computational way of assessing uncertainty, by sampling from the training data. Here we illustrate the bootstrap in a simple one-dimensional smoothing problem, and show its connection to maximum likelihood.
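The resampling idea described above can be sketched in a few lines: draw bootstrap datasets by sampling (x, y) pairs with replacement, refit the smoother on each, and read off pointwise uncertainty from the spread of the refitted curves. This is a minimal illustration, not the book's own code; the simulated data, the choice of a cubic polynomial as the smoother, and the grid size are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated one-dimensional training data (assumed for illustration):
# y = sin(x) plus Gaussian noise.
x = np.sort(rng.uniform(0.0, 3.0, 50))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)


def fit_smoother(x, y, degree=3):
    """A simple smoother: least-squares cubic polynomial coefficients."""
    return np.polyfit(x, y, degree)


# Fit once on the full training set.
coef = fit_smoother(x, y)

# Nonparametric bootstrap: resample (x_i, y_i) pairs with replacement,
# refit the smoother on each resample, and evaluate on a fixed grid.
B = 200
grid = np.linspace(0.0, 3.0, 25)
boot_fits = np.empty((B, grid.size))
for b in range(B):
    idx = rng.integers(0, x.size, x.size)
    boot_fits[b] = np.polyval(fit_smoother(x[idx], y[idx]), grid)

# Pointwise standard errors and 95% bands from the bootstrap distribution.
se = boot_fits.std(axis=0)
lower, upper = np.percentile(boot_fits, [2.5, 97.5], axis=0)
```

The bands widen where the data are sparse, which is the direct computational assessment of uncertainty the chapter refers to; the connection to maximum likelihood is developed in the text that follows.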

Citation (APA):
Hastie, T., Friedman, J., & Tibshirani, R. (2001). Model Inference and Averaging (pp. 225–256). https://doi.org/10.1007/978-0-387-21606-5_8
