Clarifying Bayes' Rule


Looking through the literature on Bayes factors, I noticed that the formulae given there are quite unclear from the standpoint of the Kolmogorov probability axioms.

E.g. consider the following expression from Wikipedia: $$ \Pr (M \mid D) = \dfrac{\Pr (D \mid M) \Pr(M)}{\Pr(D)} \ , $$ where $D$ stands for our sampled data and $M$ for the model.

Well, $D$ is a random variable, but how should I interpret $M$? In applications, e.g. when solving a hierarchical probabilistic model, I can interpret $\Pr (D \mid M)$ as the conditional density of the data given the parameter. But the concept of $\Pr(M)$ seems rather misleading to me.
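
To spell out the first point, here is a sketch of how I read $\Pr(D \mid M)$ in the parametric case (writing $\theta$ for the model's parameters and $\pi(\theta \mid M)$ for their prior; this notation is mine, not Wikipedia's): $$ \Pr (D \mid M) = \int \Pr (D \mid \theta, M) \, \pi(\theta \mid M) \, \mathrm{d}\theta \ , $$ i.e. a marginal likelihood, which is a well-defined object once $\theta$ is given a distribution. No analogous construction seems to give meaning to $\Pr(M)$ itself.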

Are there any papers with formal definitions based on an axiomatic approach?

1 Answer

Since a model of the distribution of a data set isn't, in any strict sense, an event, $\mathsf P(\mathcal M)$ isn't strictly the "probability that the model is true" in terms of a sigma-algebra interpretation. (The development of Bayesian probability theory preceded measure theory.) The prior is more sensibly interpreted as a measure of "our expectation for the truth of the model", or "our confidence in the model."

Assigning values to the prior is somewhat problematic, which is why the Bayes factor is used as a comparator: it is a ratio from which that term drops out.
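
To make the cancellation explicit (a standard manipulation, writing $\mathcal M_1$ and $\mathcal M_2$ for the two competing models): $$ \frac{\mathsf P(\mathcal M_1 \mid D)}{\mathsf P(\mathcal M_2 \mid D)} = \frac{\mathsf P(D \mid \mathcal M_1)\,\mathsf P(\mathcal M_1)/\mathsf P(D)}{\mathsf P(D \mid \mathcal M_2)\,\mathsf P(\mathcal M_2)/\mathsf P(D)} = \underbrace{\frac{\mathsf P(D \mid \mathcal M_1)}{\mathsf P(D \mid \mathcal M_2)}}_{\text{Bayes factor}} \cdot \frac{\mathsf P(\mathcal M_1)}{\mathsf P(\mathcal M_2)} \ . $$ The marginal $\mathsf P(D)$ cancels outright, and the problematic priors are isolated in the prior-odds term; under equal priors $\mathsf P(\mathcal M_1) = \mathsf P(\mathcal M_2)$, the posterior odds reduce to the Bayes factor alone.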