What does the likelihood of a model mean?


So, I understand the idea of likelihood when it comes to flipping coins. For example:

Denote by HH the event of two heads in two tosses. Assuming the successive coin flips are i.i.d., the probability of observing HH is

$$ P({HH}\mid p_{\text{H}}=0.5)=0.5^2=0.25.$$
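This probability can be checked empirically. Below is a minimal Monte Carlo sketch (the trial count and seed are arbitrary choices): simulate many pairs of fair coin flips and count how often both land heads.

```python
import random

random.seed(0)

# Monte Carlo check that Pr(HH | p_H = 0.5) = 0.5^2 = 0.25:
# simulate pairs of fair flips and count how often both are heads.
p_h = 0.5
trials = 100_000
hh = sum(
    1 for _ in range(trials)
    if random.random() < p_h and random.random() < p_h
)
print(hh / trials)  # close to 0.25
```

The empirical frequency converges to $0.25$ as the number of trials grows, in line with the formula above.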

But when it comes to models I get really confused. To calculate the Bayes factor, we can use Bayes' rule to find the likelihood of model $M,$ given the data $D,$ in this way: $$P(M\mid D)= \frac{P(D\mid M)\times P(M)}{P(D)}$$

But what does this mean intuitively? Likelihood of a model doesn't make sense to me.

Another question: If I have two models $M_1$ and $M_2$ with $M_1 \subseteq M_2,$ meaning that $M_2$ extends $M_1,$ then we must always have $P(M_2\mid D) \geq P(M_1\mid D),$ right? But then the probability odds satisfy $$\frac{P(M_1\mid D)}{P(M_2\mid D)} \leq 1.$$

What am I missing? I hope you can clarify this a bit. All answers are appreciated. Thanks in advance.

**Answer:**

In this context the words "probability" and "likelihood" often mean two different things.

Suppose the coin turns up "heads" $100\times p\%$ of the time.

Suppose $M_1$ is the statement that $0.49<p<0.51$ and $M_2$ says $0.2<p<0.8.$ Then one must have $\Pr(M_1)\le \Pr(M_2).$

The probability, given the value of $p,$ of the outcome $HH$ is $$ \Pr(HH\mid p) = p^2. $$ If that is regarded as a function of $p,$ then it is what is called a likelihood function: $L(p) = p^2.$
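The two readings of the same expression can be made concrete in a short sketch: `likelihood` below is just $p^2$, and evaluating it at several candidate values of $p$ treats it as the function $L(p)$ rather than as a single probability.

```python
# The same formula p**2 read two ways: fix p and you have the probability
# Pr(HH | p); let p vary over candidate values and you have the likelihood
# function L(p) = p^2 of the observed data HH.

def likelihood(p: float) -> float:
    return p ** 2  # Pr(HH | p)

for p in (0.25, 0.5, 0.75, 1.0):
    print(f"L({p}) = {likelihood(p)}")
```

Note that $L$ need not integrate to $1$ over $p$; it is not a probability distribution in $p$, which is exactly why "likelihood" and "probability" are kept as separate words here.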

Suppose $f(r)\, dr,$ for $0<r<1,$ is the probability distribution of $p.$ That implies, for example, that $\displaystyle \Pr(0.3<p<0.55) = \int_{0.3}^{0.55} f(r)\,dr,$ and similarly for other intervals. Then $$ \Pr(0.3<p<0.55\mid HH) = c\int_{0.3}^{0.55} L(r) f(r)\,dr = c\int_{0.3}^{0.55} r^2 f(r)\, dr $$ where $c$ is the normalizing constant, which satisfies $$ c\int_0^1 r^2 f(r)\, dr = 1. $$
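The computation above can be sketched numerically. This is a minimal illustration under one added assumption, namely a uniform prior $f(r)=1$ on $(0,1)$; the midpoint-rule integrator stands in for the integrals, and $c$ is recovered from the normalizing condition.

```python
# Posterior probability Pr(0.3 < p < 0.55 | HH), assuming (as an
# illustration only) a uniform prior f(r) = 1 on (0, 1).

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

likelihood = lambda r: r ** 2   # L(r) = r^2, from the observed data HH
prior = lambda r: 1.0           # uniform prior (an assumption)

# Normalizing constant c satisfies c * ∫_0^1 L(r) f(r) dr = 1.
evidence = integrate(lambda r: likelihood(r) * prior(r), 0.0, 1.0)
c = 1.0 / evidence

# Posterior probability of the interval (0.3, 0.55).
posterior_prob = c * integrate(lambda r: likelihood(r) * prior(r), 0.3, 0.55)
print(posterior_prob)
```

With the uniform prior the integrals can be done in closed form, giving $c = 3$ and a posterior probability of $0.55^3 - 0.3^3 \approx 0.139$, which the numerical sketch reproduces.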