Conditional probability paradox


I have coins whose probability of heads, $P$, follows a uniform distribution on $[0,1]$. I pick one coin and toss it; let $H$ denote the event that I see a head. When I try to compute $\mathbb{P}(H \mid P \leq p)$, I seem to run into a paradox:
\begin{align}
\mathbb{P}(H \mid P \leq p) &= \frac{\mathbb{P}(H, P \leq p)}{\mathbb{P}(P \leq p)} \\
&= \frac{1}{p} \int_0^p \mathbb{P}(H, P \leq p \mid P = x) \, dx \\
&= \frac{1}{p} \int_0^p x \, dx \\
&= \frac{p}{2}.
\end{align}
However, if instead I condition directly,
\begin{align}
\mathbb{P}(H \mid P \leq p) &= \int_0^p \mathbb{P}(H \mid P \leq p, P = x) \, dx \\
&= \int_0^p \mathbb{P}(H \mid P = x) \, dx \\
&= \int_0^p x \, dx \\
&= \frac{p^2}{2}.
\end{align}
This seems very contradictory to me.



Accepted answer

In both cases you forgot (or skipped over) the density when applying the law of total probability.

In the first one, $$P(H, P \le p) = \int_0^1 P(H, P \le p \mid P=x) f_P(x) \, dx$$ where $f_P(x)=1$ on $[0,1]$. Noting that $P(H, P \le p \mid P=x)=0$ for $x > p$ leads you to the answer you obtained.
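As a sanity check (my own sketch, not part of the original answer), a quick Monte Carlo simulation confirms that $p/2$ is the correct value: draw $P$ uniformly, keep only the draws with $P \le p$, and toss a coin with that head probability. The helper name `estimate` is hypothetical.

```python
import random

def estimate(p, trials=400_000, seed=1):
    """Monte Carlo estimate of P(H | P <= p):
    draw P ~ Uniform(0,1), keep only the trials where P <= p,
    then toss a coin whose head probability is that draw."""
    rng = random.Random(seed)
    heads = kept = 0
    for _ in range(trials):
        x = rng.random()              # P ~ Uniform(0, 1)
        if x <= p:                    # condition on the event {P <= p}
            kept += 1
            heads += rng.random() < x # toss: head with probability x
    return heads / kept

print(estimate(0.5))  # close to p/2 = 0.25, not p^2/2 = 0.125
```

The estimate lands near $p/2$, ruling out the $p^2/2$ answer from the second computation.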

In the second one, $$P(H \mid P \le p) = \int_0^p P(H \mid P \le p, P = x) f_{P \mid P \le p}(x) \, dx.$$ Here, $f_{P \mid P \le p}(x) = 1/p$ on $[0, p]$. This is where you were missing a factor of $p$.
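Putting that density back in (a restatement of the point above, written out in full), the corrected second computation reads
$$P(H \mid P \le p) = \int_0^p P(H \mid P = x) \, f_{P \mid P \le p}(x) \, dx = \int_0^p x \cdot \frac{1}{p} \, dx = \frac{p}{2},$$
which agrees with the first computation.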