The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and likelihoods) are assigned in appropriate hypothesis spaces using the Maximum Entropy Principle, and then manipulated via Bayes’ Theorem. Bayesian hypothesis testing requires careful consideration of the prior ranges of any parameters involved, and this leads to a quantitative statement of Occam’s Razor. As an example of this general principle, we offer a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
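The model-selection problem described above can be illustrated with a minimal sketch: for a Bayesian linear model with a polynomial basis, the log marginal likelihood (evidence) can be computed in closed form, and comparing it across model orders automatically penalises excess parameters. This is an illustration under simplifying assumptions (fixed prior precision `alpha`, known noise precision `beta`, polynomial basis functions), not the paper's own derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a quadratic, observed with Gaussian noise.
N = 30
x = np.linspace(-1.0, 1.0, N)
y_true = 1.0 - 2.0 * x + 1.5 * x**2
sigma = 0.1
y = y_true + sigma * rng.normal(size=N)

alpha = 1.0            # prior precision on the weights (assumed fixed)
beta = 1.0 / sigma**2  # noise precision (assumed known)

def log_evidence(order):
    """Log marginal likelihood of a polynomial model of the given order."""
    Phi = np.vander(x, order + 1, increasing=True)  # design matrix
    M = Phi.shape[1]
    A = alpha * np.eye(M) + beta * Phi.T @ Phi      # posterior precision
    m = beta * np.linalg.solve(A, Phi.T @ y)        # posterior mean weights
    # Regularised misfit at the posterior mean.
    E = 0.5 * beta * np.sum((y - Phi @ m)**2) + 0.5 * alpha * m @ m
    _, logdetA = np.linalg.slogdet(A)
    # The -0.5*logdetA term is the Occam factor penalising model complexity.
    return (0.5 * M * np.log(alpha) + 0.5 * N * np.log(beta)
            - E - 0.5 * logdetA - 0.5 * N * np.log(2.0 * np.pi))

ev = [log_evidence(k) for k in range(7)]
best = int(np.argmax(ev))
print("log evidence by order:", np.round(ev, 1))
print("preferred order:", best)
```

Because the data are genuinely quadratic, the evidence rises sharply up to order 2 and then flattens or falls as the Occam factor outweighs any marginal improvement in fit, so the highest-evidence model sits at or near the true order.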
Gull, S. F. (1988). Bayesian Inductive Inference and Maximum Entropy. In Maximum-Entropy and Bayesian Methods in Science and Engineering (pp. 53–74). Springer Netherlands. https://doi.org/10.1007/978-94-009-3049-0_4