On the foundations of Bayesianism

Arnborg, S.; Sjödin, G.
May 2001
AIP Conference Proceedings;2001, Vol. 568 Issue 1, p61
Academic Journal
We discuss precise assumptions entailing Bayesianism in the line of investigation started by Cox, and relate them to a recent critique by Halpern. We show that every finite model that cannot be rescaled to probability violates a natural and simple refinability principle. We introduce a new condition, separability, which is necessary and sufficient for rescalability of infinite models. Finally, we characterize the acceptable ways to handle uncertainty in infinite models based on Cox's assumptions. Certain closure properties must be assumed before all the axioms of ordered fields are satisfied. Once this is done, a proper plausibility model can be embedded in an ordered field containing the reals: either standard probability (the field of reals) for a real-valued plausibility model, or extended probability (a field of reals and infinitesimals) for an ordered plausibility model. The end result is that, if our assumptions are accepted, all reasonable uncertainty management schemes must be based on sets of extended probability distributions and Bayes conditioning.
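The rescaling argument the abstract refers to is the core of Cox's derivation, and it can be sketched in symbols (the notation below is illustrative, not taken from the paper itself). Writing $(A \mid C)$ for the plausibility of $A$ given $C$, Cox assumes the plausibility of a conjunction is some fixed function $F$ of two simpler plausibilities:

```latex
% Cox's conjunction assumption (illustrative notation):
(A \wedge B \mid C) \;=\; F\bigl((A \mid B \wedge C),\, (B \mid C)\bigr)

% Associativity of \wedge forces the functional equation
F\bigl(F(a,b),\, c\bigr) \;=\; F\bigl(a,\, F(b,c)\bigr),

% whose regular solutions admit a monotone rescaling w with
w\bigl(F(a,b)\bigr) \;=\; w(a)\, w(b),

% i.e., after rescaling, plausibilities obey the product rule:
p(A \wedge B \mid C) \;=\; p(A \mid B \wedge C)\; p(B \mid C).
```

The existence of such a rescaling $w$ is exactly the delicate step: Halpern's critique rests on finite models where no such $w$ exists, and the refinability and separability conditions described in the abstract are what restore rescalability in the finite and infinite cases, respectively.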


Related Articles

  • Der Rabe und der Bayesianist. Siebel, Mark // Journal for General Philosophy of Science;2004, Vol. 35 Issue 2, p313 

    The Raven and the Bayesian. As an essential benefit of their probabilistic account of confirmation, Bayesians state that it provides a twofold solution to the ravens paradox. It is supposed to show that (i) the paradox’s conclusion is tenable because a white shoe only negligibly confirms...

  • Probability distributions of molecular observables computed from Markov models. II. Uncertainties in observables and their time-evolution. Chodera, John D.; Noé, Frank // Journal of Chemical Physics;9/14/2010, Vol. 133 Issue 10, p105102 

    Discrete-state Markov (or master equation) models provide a useful simplified representation for characterizing the long-time statistical evolution of biomolecules in a manner that allows direct comparison with experiments as well as the elucidation of mechanistic pathways for an inherently...

  • Volume measurement of cryogenic deuterium pellets by Bayesian analysis of single shadowgraphy images. Szepesi, T.; Kálvin, S.; Kocsis, G.; Lang, P. T.; Wittmann, C. // Review of Scientific Instruments;Mar2008, Vol. 79 Issue 3, p033501 

    In situ commissioning of the Blower-gun injector for launching cryogenic deuterium pellets at ASDEX Upgrade tokamak was performed. This injector is designed for high repetitive launch of small pellets for edge localised modes pacing experiments. During the investigation the final injection...

  • Statistical techniques in high energy physics. Lyons, Louis // AIP Conference Proceedings;2001, Vol. 583 Issue 1, p31 

    Rather than attempting to cover a wide range of statistical problems, I shall concentrate on four topics: 1) the argument between Bayesians and Frequentists; 2) a paradox in comparing data with two hypotheses; 3) the CLs method used in the search for the Higgs at CERN; 4) the MLBZ method of...

  • Measured responses to quantum Bayesianism. Hobson, Art // Physics Today;Dec2012, Vol. 65 Issue 12, p11 

    The author expresses his views about a response to a commentary by David Mermin on quantum Bayesianism. The author says that the Bayesian degree of belief interpretation can be ideal for mixed states in quantum physics. A study by Matthew Pusey and colleagues claims that any quantum system has...

  • Principal Component Analysis and Bayesian Classifier Based Character Recognition. Gupta, Gopal Krishna // AIP Conference Proceedings;2004, Vol. 707 Issue 1, p465 

    Extensive research has been done on character recognition using the Bayesian Classifier. This paper discusses another approach to character recognition that combines Principal Component Analysis (PCA) and the Bayesian Classifier. PCA extracts the unique information from the feature set of the...

  • Computing Bayes Factors Using Thermodynamic Integration. Lartillot, Nicolas; Philippe, Hervé // Systematic Biology;Apr2006, Vol. 55 Issue 2, p195 

    In the Bayesian paradigm, a common method for comparing two models is to compute the Bayes factor, defined as the ratio of their respective marginal likelihoods. In recent phylogenetic works, the numerical evaluation of marginal likelihoods has often been performed using the harmonic mean...

  • The product t density distribution arising from the product of two Student’s t PDFs. Nadarajah, Saralees // Statistical Papers;Jun2009, Vol. 50 Issue 3, p605 

    The Student’s t distribution has become increasingly prominent and is considered as a competitor to the normal distribution. Motivated by real examples in Physics, decision sciences and Bayesian statistics, a new t distribution is introduced by taking the product of two Student’s t...

  • From Quantum Interference to Bayesian Coherence and Back Round Again. Fuchs, Christopher A.; Schack, Rüdiger // AIP Conference Proceedings;3/10/2009, Vol. 1101 Issue 1, p260 

    In the quantum Bayesian understanding of quantum states being developed by the authors and collaborators, the Born Rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, we argue...

  • Sensitivity Study on Availability of I&C Components Using Bayesian Network. Ur, Rahman Khalil; Jinsoo Shin; Zubair, Muhammad; Gyunyoung Heo; Hanseong Son // Science & Technology of Nuclear Installations;2013, p1 

    The objective of this study is to find out the impact of instrumentation and control (I&C) components on the availability of I&C systems in terms of sensitivity analysis using Bayesian network. The analysis has been performed on the I&C architecture of a reactor protection system. The analysis results...

