Bayesian Inductive Inference and Maximum Entropy

  • Gull, S. F.

Abstract

The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and likelihoods) are assigned in appropriate hypothesis spaces using the Maximum Entropy Principle, and then manipulated via Bayes' Theorem. Bayesian hypothesis testing requires careful consideration of the prior ranges of any parameters involved, and this leads to a quantitative statement of Occam's Razor. As an example of this general principle we offer a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
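The model-selection idea summarized above can be illustrated with a minimal sketch, assuming a linear-Gaussian setup that is not necessarily the one used in the paper: a Legendre-polynomial basis, i.i.d. Gaussian noise of known variance, and a zero-mean Gaussian prior on the coefficients. Under these assumptions the coefficients can be marginalized analytically, and comparing the resulting log evidence across model orders exhibits the quantitative Occam's Razor (the data, noise level, and prior scale below are illustrative only).

```python
import numpy as np

def log_evidence(y, A, sigma2=0.05**2, alpha=1.0):
    """Log marginal likelihood of the linear-Gaussian model y = A w + noise,
    with prior w ~ N(0, I/alpha) and i.i.d. Gaussian noise of variance sigma2.
    Marginalising w gives y ~ N(0, C) with C = sigma2*I + A A^T / alpha."""
    N = len(y)
    C = sigma2 * np.eye(N) + A @ A.T / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (N * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

# Hypothetical data: noisy samples of a smooth curve, fitted with
# Legendre-polynomial bases of increasing size.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = np.sin(2 * x) + 0.05 * rng.standard_normal(x.size)

for k in range(1, 9):  # k = number of basis functions (model order)
    A = np.polynomial.legendre.legvander(x, k - 1)  # design matrix, shape (40, k)
    print(k, round(log_evidence(y, A), 2))
# The evidence rises while additional terms genuinely improve the fit, then
# falls once extra parameters are penalised more than they help.
```

The Occam penalty needs no ad hoc term here: it emerges from the determinant of C, which grows as the prior volume spanned by unnecessary basis functions increases.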

Citation (APA)

Gull, S. F. (1988). Bayesian Inductive Inference and Maximum Entropy. In Maximum-Entropy and Bayesian Methods in Science and Engineering (pp. 53–74). Springer Netherlands. https://doi.org/10.1007/978-94-009-3049-0_4
