I am observing a sequence of heads and tails and trying to deduce the bias of my coin.
Let's say I observe HTH. Can I estimate the bias of my coin $p$ ? Using Bayes formula,
\begin{align} \mathbb{P}[p = x \pm \epsilon \,\big|\, HTH ] &= \frac{ \mathbb{P}[HTH \,\big|\, p = x \pm \epsilon ]\; \mathbb{P}[p = x \pm \epsilon]}{\mathbb{P}[ HTH ]} \\
&\propto \mathbb{P}[HTH \,\big|\, p = x \pm \epsilon ]\; \mathbb{P}[p = x \pm \epsilon]\end{align}
- My 3 coin flips are independent, so the likelihood is $\mathbb{P}[HTH \,\big|\, p]= p^2(1-p)$.
- I think we have to make an assumption about the possible values of $p$, e.g. a uniform prior with $\mathbb{P}[p = x \pm \epsilon] = 2\epsilon$.
- Given no prior observations, is $\mathbb{P}[ HTH ] = \tfrac{1}{8}$? It's not clear what to put here.
I think I am deriving the Beta distribution $d\mu = x^a(1-x)^b \, dx$. Is there a name for the assumptions I am using here? The term might be *conjugate prior*.
What happened to the normalization? I get:
$$ \frac{ \mathbb{P}[HTH\big|\, p = x \pm \epsilon ]\; \mathbb{P}[p = x \pm \epsilon]}{\mathbb{P}[ HTH ]} = \frac{ x^2 (1-x) 2\epsilon }{\tfrac{1}{8}} = 16\epsilon \cdot x^2(1-x) $$
What does this factor $\boxed{16\epsilon}$ mean? How do I get the correct normalization?
You will find it easier if you treat $p$ as having a continuous distribution with a prior density $\pi_0(p)$ and look for a posterior density $\pi(p|HTH)$.
Your posterior distribution would then be $$\pi(p|HTH) = \dfrac{\pi_0(p)\; \mathbb{P}(HTH|p)}{\mathbb{P}(HTH)}=\dfrac{ \pi_0(p) \; p^2(1-p)}{\displaystyle \int_{q=0}^1 \pi_0(q) \; q^2(1-q) \; dq}.$$
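The formula above can be checked numerically. Here is a minimal sketch that assumes a uniform prior $\pi_0(p) = 1$ on $[0,1]$ and approximates the normalizing integral on a grid; the function and variable names are illustrative, not from the original post.

```python
import numpy as np

def likelihood(p):
    # P(HTH | p) for independent flips: two heads and one tail
    return p**2 * (1 - p)

dp = 1e-5
p = np.arange(dp / 2, 1, dp)          # midpoint grid on (0, 1)
prior = np.ones_like(p)               # uniform prior density pi_0(p) = 1

# Denominator: evidence = integral of pi_0(q) q^2 (1 - q) dq = 1/12
evidence = np.sum(prior * likelihood(p)) * dp

# Posterior density pi(p | HTH)
posterior = prior * likelihood(p) / evidence

print(evidence)                        # ~0.083333 (= 1/12)
print(np.sum(posterior) * dp)          # ~1.0: a proper density
```

Note that the evidence is $\int_0^1 q^2(1-q)\,dq = \tfrac{1}{12}$, not $\tfrac{1}{8}$: once $p$ is treated as continuous, the denominator is an average of the likelihood over the prior, which is exactly what makes the posterior integrate to $1$ with no stray $16\epsilon$ factor.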
Using a conjugate prior, namely a Beta distribution, makes the calculations easier (not always a good reason for choosing it): if the prior is $p \sim B(\alpha,\beta)$, then the posterior is $p \,\big|\, HTH \sim B(\alpha+2,\beta+1)$, since $HTH$ contributes two heads and one tail. The mean of the posterior distribution is then $\frac{\alpha+2}{\alpha+\beta+3}$. If you started with a uniform prior for $p$ (again, not always a good choice, especially not for a coin), i.e. $\alpha = \beta = 1$, then your posterior would be $B(3,2)$ with mean $\frac{3}{5}$.
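The conjugate update is simple enough to write down directly. A small sketch, assuming the Beta–Bernoulli conjugacy above (the helper `beta_update` is illustrative, not a library function):

```python
def beta_update(alpha, beta, flips):
    """Posterior (alpha, beta) of a Beta prior after observing a string of H/T flips.

    Each head adds 1 to alpha, each tail adds 1 to beta.
    """
    heads = flips.count("H")
    tails = flips.count("T")
    return alpha + heads, beta + tails

# Uniform prior Beta(1, 1), then observe HTH
a, b = beta_update(1, 1, "HTH")
print((a, b))        # (3, 2)
print(a / (a + b))   # posterior mean 3/5 = 0.6
```

The update never needs the normalizing constant: counting heads and tails is all the bookkeeping the conjugate family requires.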