On the foundations of Bayesianism

Arnborg, S.; Sjödin, G.
May 2001
AIP Conference Proceedings;2001, Vol. 568 Issue 1, p61
Conference Proceeding
We discuss precise assumptions entailing Bayesianism in the line of investigation started by Cox, and relate them to a recent critique by Halpern. We show that every finite model which cannot be rescaled to probability violates a natural and simple refinability principle. A new condition, separability, is shown to be necessary and sufficient for rescalability of infinite models. Finally, we characterize the acceptable ways to handle uncertainty in infinite models based on Cox's assumptions. Certain closure properties must be assumed before all the axioms of ordered fields are satisfied. Once this is done, a proper plausibility model can be embedded in an ordered field containing the reals: either standard probability (the field of reals) for a real-valued plausibility model, or extended probability (a field of reals and infinitesimals) for an ordered plausibility model. The end result is that if our assumptions are accepted, all reasonable uncertainty management schemes must be based on sets of extended probability distributions and Bayes conditioning.
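The Bayes conditioning that the abstract identifies as the end result can be illustrated on a small finite model. The sketch below is not from the paper; the hypothesis names, plausibility values, and likelihoods are all assumed for illustration. It shows the finite real-valued case: unnormalized plausibilities rescaled to a probability distribution, then updated by Bayes conditioning on evidence.

```python
# Toy illustration (assumed values, not from the paper): a finite
# real-valued plausibility model rescaled to probability, then
# conditioned on evidence E via Bayes' rule.

# Unnormalized plausibilities over three hypotheses.
plaus = {"H1": 2.0, "H2": 1.0, "H3": 1.0}

# Rescale to a probability distribution (the finite real-valued case).
total = sum(plaus.values())
prior = {h: p / total for h, p in plaus.items()}

# Likelihood of the evidence E under each hypothesis (assumed values).
likelihood = {"H1": 0.9, "H2": 0.3, "H3": 0.1}

# Bayes conditioning: posterior is proportional to prior times likelihood.
joint = {h: prior[h] * likelihood[h] for h in prior}
evidence = sum(joint.values())
posterior = {h: j / evidence for h, j in joint.items()}
```

Infinite or merely ordered plausibility models would, per the abstract, require extended probabilities (infinitesimals), which plain floating-point arithmetic cannot represent; this sketch covers only the real-valued case.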


Related Articles

  • Nonlinear dynamical factor analysis. Giannakopoulos, Xavier; Valpola, Harri // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p305 

    A general method for state space analysis is presented where not only underlying factors generating the data are estimated, but also the dynamics behind time series in factor space are modelled. The mappings and the states are all unknown. The nonlinearity of the mappings makes the problem...

  • Generalizing the Lomb-Scargle periodogram. Bretthorst, G. Larry // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p241 

    This paper is an elaboration of an issue that arose in the paper "Nonuniform Sampling: Bandwidth and Aliasing" [1]. In that paper the single frequency estimation problem was explored using Bayesian probability theory for quadrature data that were sampled nonuniformly and nonsimultaneously. In...

  • Penalized Maximum Likelihood Estimation for univariate normal mixture distributions. Ridolfi, A.; Idier, J. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p229 

    Due to singularities of the likelihood function, maximum likelihood estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is resolved by penalizing the likelihood function. In the Bayesian framework, it amounts...

  • Maxentropic interpolation by cubic splines with possibly noisy data. Gzyl, H.; Velasquez, Y. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p216 

    The gist of this note is to present a procedure for obtaining numerical solutions of Fredholm equations of the first kind of the type ∫₀¹ K(s, t)x(t) dt + ε(s) = y(s), where K : V → W is a linear operator mapping V = C([0, 1]), the continuous functions on [0, 1], into some...

  • Image modeling and restoration—Information fusion, Set-theoretic methods, and the Maximum entropy principle. Ishwar, P.; Moulin, P. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p204 

    Several powerful but heuristic techniques in the recent image denoising literature have used overcomplete image representations. We present a general framework for fusing information from multiple representations based on fundamental statistical estimation principles, where information about image...

  • Experimental design to maximize information. Sebastiani, P.; Wynn, H. P. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p192 

    This paper will consider different methods to measure the gain of information that an experiment provides on parameters of a statistical model. The approach we follow is Bayesian and relies on the assumption that information about model parameters is represented by their probability distribution...

  • The quantization of the attention function under a Bayes information theoretic model. Wynn, H. P.; Sebastiani, P. // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p159 

    Bayes experimental design using entropy, or equivalently negative information, as a criterion is fairly well developed. The present work applies this model but at a primitive level in statistical sampling. It is assumed that the observer/experimenter is allowed to place a window over the support...

  • Characterization of Pearsonian and bilateral power series distribution via maximum entropies. Borzadaran, G. R. Mohtashami // AIP Conference Proceedings;2001, Vol. 568 Issue 1, p145 

    The maximization of the entropy in a class of distributions subject to certain constraints has an important role in statistical theory. Kagan, Linnik & Rao (1973), Kapur (1989) and A. W. Kemp (1997) used maximum entropy distributions and characterized a large number of discrete and...

  • Learning Naive Physics by Visual Observation: Using Qualitative Spatial Representations and Probabilistic Reasoning. Boxer, Paul A. // International Journal of Computational Intelligence & Applications;Sep2001, Vol. 1 Issue 3, p273 

    Autonomous robots are unsuccessful at operating in complex, unconstrained environments. They lack the ability to learn about the physical behavior of different objects through the use of vision. We combine Bayesian networks and qualitative spatial representation to learn general physical...
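The Gzyl–Velasquez entry in the list above concerns numerical solution of Fredholm equations of the first kind. As a hedged sketch only: the authors use a maximum-entropy method with cubic splines, whereas the code below substitutes plain Tikhonov regularization on a quadrature discretization, and the kernel, grid size, true solution, and regularization parameter are all assumed stand-ins chosen for illustration.

```python
import numpy as np

# Discretize y(s) = ∫₀¹ K(s,t) x(t) dt on a uniform grid with uniform
# quadrature weights, giving a linear system A x ≈ y.
n = 50
t = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / n)                        # uniform quadrature weights

K = np.exp(-np.abs(t[:, None] - t[None, :]))   # assumed smoothing kernel K(s, t)
x_true = np.sin(np.pi * t)                     # assumed true solution
A = K * w                                      # A[i, j] = K(s_i, t_j) * w_j
y = A @ x_true                                 # noise-free data for this sketch

# First-kind equations are ill-posed; stabilize the normal equations with
# Tikhonov regularization (a stand-in for the paper's maxent approach).
lam = 1e-6
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```

With noisy data ε(s), as in the paper's setting, the regularization parameter would have to be chosen to balance the data misfit against the stability of the reconstruction.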

