On the foundations of Bayesianism

Arnborg, S.; Sjödin, G.
May 2001
AIP Conference Proceedings;2001, Vol. 568 Issue 1, p61
Academic Journal
We discuss precise assumptions entailing Bayesianism in the line of investigations started by Cox, and relate them to a recent critique by Halpern. We show that every finite model which cannot be rescaled to probability violates a natural and simple refinability principle. A new condition, separability, is shown to be necessary and sufficient for rescalability of infinite models. Finally, we characterize the acceptable ways to handle uncertainty in infinite models based on Cox's assumptions. Certain closure properties must be assumed before all the axioms of ordered fields are satisfied. Once this is done, a proper plausibility model can be embedded in an ordered field containing the reals: either standard probability (the field of reals) for a real-valued plausibility model, or extended probability (a field of reals and infinitesimals) for an ordered plausibility model. The end result is that, if our assumptions are accepted, all reasonable uncertainty management schemes must be based on sets of extended probability distributions and Bayes conditioning.
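The rescaling argument in the abstract descends from Cox's functional-equation derivation; a minimal sketch of that derivation (our notation, not the paper's):

```latex
% Cox's consistency requirement: the plausibility of a conjunction is a
% function F of the plausibilities of its parts,
p(A \wedge B \mid C) = F\bigl(p(A \mid C),\, p(B \mid A \wedge C)\bigr).
% Associativity of conjunction forces F to satisfy
F\bigl(F(x, y), z\bigr) = F\bigl(x, F(y, z)\bigr),
% whose regular solutions admit a monotone rescaling w with
w\bigl(p(A \wedge B \mid C)\bigr)
  = w\bigl(p(A \mid C)\bigr)\, w\bigl(p(B \mid A \wedge C)\bigr),
% i.e. the usual product rule, with Bayes conditioning as a corollary.
```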


Related Articles

  • Bayesian field theory and approximate symmetries. Lemm, J. C. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p425 

    Nonparametric Bayesian approaches to density estimation ("Bayesian field theories") typically have to be solved numerically on a lattice, which is often quite expensive. This paper shows that such numerical calculations are nowadays feasible for some interesting problem...

  • Nonlinear dynamical factor analysis. Giannakopoulos, Xavier; Valpola, Harri // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p305 

    A general method for state space analysis is presented where not only underlying factors generating the data are estimated, but also the dynamics behind time series in factor space are modelled. The mappings and the states are all unknown. The nonlinearity of the mappings makes the problem...

  • Generalizing the Lomb-Scargle periodogram. Bretthorst, G. Larry // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p241 

    This paper is an elaboration of an issue that arose in the paper "Nonuniform Sampling: Bandwidth and Aliasing" [1]. In that paper the single frequency estimation problem was explored using Bayesian probability theory for quadrature data that were sampled nonuniformly and nonsimultaneously. In...
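For readers who want the classical construction the blurb alludes to, here is a minimal NumPy sketch of the Lomb-Scargle periodogram for unevenly sampled data (the toy signal and frequency grid are ours, not the paper's quadrature setup):

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for unevenly sampled data.

    t, y  : sample times and values
    freqs : angular frequencies at which to evaluate the periodogram
    """
    y = y - y.mean()
    power = np.empty_like(freqs)
    for i, w in enumerate(freqs):
        # The phase shift tau makes the sine and cosine terms orthogonal.
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

# Unevenly sampled noisy sinusoid at 1.5 Hz.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 20.0, 200))
y = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(200)
freqs = 2 * np.pi * np.linspace(0.1, 3.0, 500)
best = freqs[np.argmax(lomb_scargle(t, y, freqs))] / (2 * np.pi)
```

The peak of the periodogram recovers the underlying frequency even though the samples are irregular.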

  • Role and meaning of subjective probability: Some comments on common misconceptions. D'Agostini, G. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p23 

    Criticisms of so-called 'subjective probability' come, on the one hand, from those who maintain that probability in physics has only a frequentistic interpretation and, on the other, from those who tend to 'objectivise' Bayesian theory, arguing, e.g., that subjective probabilities are indeed...

  • Penalized Maximum Likelihood Estimation for univariate normal mixture distributions. Ridolfi, A.; Idier, J. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p229 

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill posed optimization problem. Ill posedness is solved by penalizing the likelihood function. In the Bayesian framework, it amounts...
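The singularity and its penalized repair are easy to exhibit numerically; a small sketch (the inverse-gamma-style penalty on the variance is our illustrative choice, not necessarily the authors' prior):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.5, 3.1])  # toy data

def mix_loglik(x, mu1, s1, mu2, s2, pi=0.5):
    """Log-likelihood of a two-component univariate normal mixture."""
    d1 = pi * np.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / (np.sqrt(2 * np.pi) * s1)
    d2 = (1 - pi) * np.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / (np.sqrt(2 * np.pi) * s2)
    return np.sum(np.log(d1 + d2))

# Degeneracy: lock mu1 onto a data point and shrink s1 toward 0.
scales = (1e-1, 1e-3, 1e-5)
unpenalized = [mix_loglik(x, mu1=0.0, s1=s, mu2=1.5, s2=1.0) for s in scales]
# The likelihood diverges: each value is larger than the last.

# An inverse-gamma-type penalty -b/s^2 - a*log(s^2) dominates the
# log(1/s) blow-up and restores a bounded criterion.
a, b = 1.0, 0.1
penalized = [ll - b / s ** 2 - a * np.log(s ** 2)
             for ll, s in zip(unpenalized, scales)]
```

The unpenalized log-likelihood grows without bound as the component variance shrinks onto a data point, while the penalized criterion moves away from the singular configuration.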

  • Maxentropic interpolation by cubic splines with possibly noisy data. Gzyl, H.; Velasquez, Y. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p216 

    The gist of this note is to present a procedure for obtaining numerical solutions of Fredholm equations of the first kind of the type ∫₀¹K(s, t)x(t)dt + e(s) = y(s), where K : V → W is a linear operator mapping V = C([0, 1]), the continuous functions on [0,1], into some other Banach...
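A minimal numerical illustration of why first-kind equations need regularization, using plain Tikhonov least squares as a stand-in for the maximum-entropy procedure of the paper (kernel, grid, and noise level are our assumptions):

```python
import numpy as np

# Discretize ∫₀¹ K(s,t) x(t) dt + e(s) = y(s) on a uniform midpoint grid.
n = 100
t = (np.arange(n) + 0.5) / n
K = np.exp(-5.0 * np.abs(t[:, None] - t[None, :])) / n  # smoothing kernel, dt folded in
x_true = np.sin(np.pi * t)
rng = np.random.default_rng(1)
y = K @ x_true + 1e-4 * rng.standard_normal(n)          # noisy data e(s)

# First-kind equations are ill-posed: K is badly conditioned, so a naive
# solve amplifies noise.  Tikhonov regularization stabilizes the inverse.
lam = 1e-6
x_hat = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

With the regularization term the relative reconstruction error stays small despite the ill-conditioning of the discretized operator.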

  • Image modeling and restoration: Information fusion, Set-theoretic methods, and the Maximum entropy principle. Ishwar, P.; Moulin, P. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p204 

    Several powerful but heuristic techniques in the recent image denoising literature have used overcomplete image representations. We present a general framework for fusing information from multiple representations based on fundamental statistical estimation principles, where information about image...

  • Experimental design to maximize information. Sebastiani, P.; Wynn, H. P. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p192 

    This paper will consider different methods to measure the gain of information that an experiment provides on parameters of a statistical model. The approach we follow is Bayesian and relies on the assumption that information about model parameters is represented by their probability distribution...
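As a toy version of measuring the information an experiment provides, the expected Shannon information gain for a conjugate normal-mean model is available in closed form (this illustrative setup is ours, not the paper's):

```python
import numpy as np

def expected_gain(sigma0_sq, noise_sq, n):
    """Expected Shannon information gain (prior entropy minus posterior
    entropy) for a normal mean with known noise variance: under a
    N(0, sigma0_sq) prior and n observations of variance noise_sq, the
    posterior variance is fixed in advance, so the expected gain is exact."""
    post_sq = 1.0 / (1.0 / sigma0_sq + n / noise_sq)
    return 0.5 * np.log(sigma0_sq / post_sq)

# Comparing two candidate designs: 4 noisy vs 1 precise observation.
gain_many_noisy = expected_gain(sigma0_sq=1.0, noise_sq=4.0, n=4)
gain_one_clean  = expected_gain(sigma0_sq=1.0, noise_sq=1.0, n=1)
```

Designs contributing equal total precision (here 4 observations of variance 4 versus 1 of variance 1) yield identical expected gains, which is exactly the kind of comparison an information-based design criterion supports.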

  • Characterization of Pearsonian and bilateral power series distribution via maximum entropies. Borzadaran, G. R. Mohtashami // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p145 

    The maximization of the entropy in a class of distributions subject to certain constraints has an important role in statistical theory. Kagan, Linnik & Rao (1973), Kapur (1989) and A. W. Kemp (1997) used maximum entropy distributions and characterized a large number of discrete and...

