# Bayesian updating normal distribution

*03-Nov-2017 09:39*

The more general results were obtained later by the statistician David A. Freedman, who established in two seminal research papers (1963 and 1965) when and under what circumstances the asymptotic behaviour of the posterior is guaranteed.

Jeffrey's rule applies Bayes' rule to the case where the evidence itself is assigned a probability.

Bayesian theory calls for the use of the posterior predictive distribution to do predictive inference, i.e., to predict the distribution of a new, unobserved data point.

Ian Hacking noted that traditional "Dutch book" arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books.
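Jeffrey's rule generalises ordinary conditioning: when the evidence partition {B_i} is not observed with certainty but merely reassigned new probabilities q_i, beliefs update as P_new(A) = Σ_i P(A | B_i) · q_i. A minimal sketch with made-up numbers (the partition, conditionals, and new probabilities below are illustrative assumptions, not from the source):

```python
# Jeffrey's rule (probability kinematics): the evidence B_1, B_2 is itself
# uncertain, receiving new probabilities q_1, q_2 rather than being observed.
p_A_given_B = [0.7, 0.2]   # P(A | B_1), P(A | B_2) -- hypothetical values
q = [0.6, 0.4]             # new probabilities assigned to B_1, B_2

# P_new(A) = sum_i P(A | B_i) * q_i
p_new_A = sum(pa * qi for pa, qi in zip(p_A_given_B, q))
print(p_new_A)  # ≈ 0.50
```

When some q_i = 1 (the evidence is certain), the rule reduces to ordinary Bayesian conditioning on B_i.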

Later, in the 1980s and 1990s, Freedman and Persi Diaconis continued to work on the case of countably infinite probability spaces.

In parameterized form, the prior distribution is often assumed to come from a family of distributions called conjugate priors.

The usefulness of a conjugate prior is that the corresponding posterior distribution will be in the same family, and the calculation may be expressed in closed form.
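As a concrete instance of such a closed-form update, and matching the note's title, the normal distribution is self-conjugate for the mean when the noise variance is known: precisions add, and the posterior mean is a precision-weighted average. A minimal sketch (function name and the example numbers are illustrative assumptions):

```python
import math

def normal_update(mu0, sigma0, x, sigma):
    """Conjugate update for the mean of a Normal(mu, sigma**2) likelihood
    with known noise std `sigma`, under a Normal(mu0, sigma0**2) prior.
    By conjugacy the posterior is again normal, in closed form."""
    tau0 = 1.0 / sigma0**2           # prior precision
    tau = 1.0 / sigma**2             # per-observation precision
    n = len(x)
    tau_post = tau0 + n * tau        # precisions add
    mu_post = (tau0 * mu0 + tau * sum(x)) / tau_post
    return mu_post, math.sqrt(1.0 / tau_post)

# Three observations pull a vague prior (mean 0, std 2) toward the data.
mu_post, sigma_post = normal_update(mu0=0.0, sigma0=2.0,
                                    x=[1.0, 1.2, 0.8], sigma=1.0)
```

Because the posterior is again normal, further data can be folded in by calling the same function with the posterior as the new prior; this sequential update gives the same answer as a single batch update.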

His 1963 paper, like Doob (1949), treats the finite case and comes to a satisfactory conclusion.

However, if the random variable has an infinite but countable probability space (i.e., corresponding to a die with infinitely many faces), the 1965 paper demonstrates that for a dense subset of priors the Bernstein–von Mises theorem is not applicable.

The technique is, however, equally applicable to discrete distributions.

This can be interpreted to mean that hard convictions are insensitive to counter-evidence.

The latter can be derived by applying the first rule to the event "not ", from which the result immediately follows.
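A discrete update makes the point about hard convictions explicit: a hypothesis assigned prior probability 0 (or 1) keeps that value no matter what evidence arrives, since the prior multiplies the likelihood. A minimal sketch with illustrative hypotheses and likelihoods (the coin scenario and numbers are assumptions for demonstration):

```python
def bayes_update(prior, likelihood):
    """Bayes' rule over a finite hypothesis space.
    prior: dict H -> P(H); likelihood: dict H -> P(E | H)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())          # normalising constant P(E)
    return {h: p / z for h, p in unnorm.items()}

# Hard conviction: "two-headed" is assigned prior probability exactly 0.
prior = {"fair": 0.5, "biased": 0.5, "two-headed": 0.0}
likelihood = {"fair": 0.5, "biased": 0.8, "two-headed": 1.0}  # P(heads | H)

post = bayes_update(prior, likelihood)
# post["two-headed"] stays 0.0 even though that hypothesis best explains
# the evidence: a zero prior is insensitive to counter-evidence.
```

Repeated heads would keep shifting mass from "fair" to "biased", but "two-headed" can never recover from its zero prior.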