Breaking down a conditional probability into a marginal probability


I was reading a blog post on using Bayesian statistics to obtain the probability of getting 2 heads in a row with a coin, given that 14 previous tosses resulted in 10 heads.

I am confused about how the highlighted step in the image was obtained. If anyone could clarify or prove the equality, I'd really appreciate it.

Thanks!

[Image from the blog post showing the highlighted step.]

Best Answer

> I am confused about how the highlighted step in the image was obtained. If anyone could clarify or prove the equality, I'd really appreciate it.

First observe that

$$\mathbb{P}[HH,p|\text{data}]=\mathbb{P}[HH|p,\text{data}]\cdot\mathbb{P}[p|\text{data}]$$

Now, using the assumption of conditional independence of the observations given the parameter $p$ (so that $\mathbb{P}[HH|p,\text{data}]=\mathbb{P}[HH|p]$), this simplifies to

$$\mathbb{P}[HH,p|\text{data}]=\mathbb{P}[HH|p]\cdot\mathbb{P}[p|\text{data}]$$

At this point, to eliminate $p$ on the left-hand side, integrate both sides over $p$; the left-hand side then marginalizes to $\int_0^1\mathbb{P}[HH,p|\text{data}]\,dp=\mathbb{P}[HH|\text{data}]$, giving

$$ \bbox[5px,border:2px solid black] { \mathbb{P}[HH|\text{data}]=\int_0^1\mathbb{P}[HH|p]\cdot\mathbb{P}[p|\text{data}]dp \ } $$

Observe that inside the integral you have

$$\underbrace{\mathbb{P}[HH|p]}_{=\text{model}}\cdot\underbrace{\mathbb{P}[p|\text{data}]}_{=\text{Posterior Distribution}}$$
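The boxed integral can be evaluated numerically for the data in the question. A minimal sketch, assuming a uniform $\text{Beta}(1,1)$ prior on $p$ (the prior is not stated above, so this is an assumption): with 10 heads in 14 tosses the posterior is $\text{Beta}(11,5)$, the model is $\mathbb{P}[HH|p]=p^2$, and the integral is just the posterior mean of $p^2$, which has the closed form $\frac{a(a+1)}{(a+b)(a+b+1)}$ for a $\text{Beta}(a,b)$ posterior.

```python
from math import gamma

# Data: 10 heads in 4 + 10 = 14 tosses; model P[HH | p] = p^2.
# Assumption: uniform Beta(1, 1) prior, so the posterior
# P[p | data] is Beta(1 + 10, 1 + 4) = Beta(11, 5).
a, b = 1 + 10, 1 + 4

# Closed form for E[p^2] under a Beta(a, b) distribution.
closed_form = a * (a + 1) / ((a + b) * (a + b + 1))

# Numerical check: midpoint-rule quadrature of
# integral_0^1 p^2 * Beta(a, b) density dp.
norm = gamma(a + b) / (gamma(a) * gamma(b))  # 1 / B(a, b)
n = 100_000
h = 1.0 / n
numeric = h * sum(
    (p := (i + 0.5) * h) ** 2 * norm * p ** (a - 1) * (1 - p) ** (b - 1)
    for i in range(n)
)

print(round(closed_form, 4))            # ≈ 0.4853
print(abs(closed_form - numeric) < 1e-6)  # the two agree
```

Note that the answer exceeds $\hat p^2=(10/14)^2\approx 0.51$ only under the MLE plug-in; integrating over the full posterior accounts for the uncertainty in $p$, which is the whole point of the boxed equation.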