Probability that a coin with unknown bias comes up heads again if it came up heads 3 times in a row?


I've recently come across a textbook problem that does not have an answer, and was looking to see if my solution made sense. Here is the question:

A coin has an unknown probability of coming up heads, drawn from a uniform distribution on $[0, 1]$. We toss the coin three times and each time it comes up heads. What is the probability that it will come up heads again?

Right now I have that:

$P(3H|p=x) = f_{3H|p=x}(x) = x^3$

$P(p=x) = f_{p}(x) = 1$

$P(3H) = \int_{0}^{1} P(3H \mid p=x) \, f_{p}(x) \, dx = \int_{0}^{1} x^3 \, dx = \frac{x^4}{4}\Big|_0^1 = \frac{1}{4}$

And with Bayes' theorem we have:

$P(p=x|3H) = f_{p=x|3H}(x) = \frac{x^3 \cdot 1}{\frac{1}{4}} = 4x^3$

So

$P(H|3H) = \int_{0}^{1} x \cdot f_{p=x|3H}(x) \, dx = \int_{0}^{1} x \cdot 4x^3 \, dx = \frac{4}{5} x^5 \Big|_0^1 = \frac{4}{5}$
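As a sanity check, the $4/5$ answer can be confirmed by simulation: draw a bias uniformly at random, toss the coin four times, condition on the first three tosses being heads, and look at the fourth. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 1_000_000

# Draw a bias p ~ Uniform(0, 1) for each simulated coin, then toss it 4 times.
p = rng.uniform(0.0, 1.0, size=n_sims)
tosses = rng.uniform(size=(n_sims, 4)) < p[:, None]

# Keep only the coins whose first three tosses were all heads,
# and estimate the fraction whose fourth toss is also heads.
three_heads = tosses[:, :3].all(axis=1)
estimate = tosses[three_heads, 3].mean()
print(estimate)  # close to 0.8
```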


Best answer:

Yes, your work is correct.

This question is a specific example of the binomial likelihood model with a beta conjugate prior. Specifically, the outcomes of the coin tosses are iid Bernoulli, and the number of heads observed is binomial:

$$p \sim \operatorname{Beta}(a,b), \\ X_i \mid p \sim \operatorname{Bernoulli}(p), \\ Y \mid p = \sum_{i=1}^n X_i \mid p \sim \operatorname{Binomial}(n, p)$$

where the prior hyperparameters are $a = b = 1$. The posterior density of $p$ given $Y = y$ heads observed is also beta distributed: $$p \mid Y = y \sim \operatorname{Beta}(a + y, b + n - y), \\ f_{p \mid y}(p) = \frac{\Gamma(a+b+n)}{\Gamma(a+y)\Gamma(b+n-y)} p^{a+y-1} (1-p)^{b+n-y-1}.$$ In other words, we add the number of observed heads to $a$ and the number of observed tails to $b$ in order to get the posterior for $p$.
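With $a = b = 1$ and $y = n = 3$, the posterior is $\operatorname{Beta}(4, 1)$, whose density is exactly the $4x^3$ derived above and whose mean is the predictive probability of heads. A sketch of this check, assuming SciPy is available:

```python
from scipy import stats

# Beta(1, 1) prior, y = 3 heads observed in n = 3 tosses.
a, b = 1, 1
n, y = 3, 3

# Posterior: Beta(a + y, b + n - y) = Beta(4, 1).
posterior = stats.beta(a + y, b + n - y)

# Posterior density matches 4p^3 (checked at p = 0.5),
# and the posterior mean is the predictive probability of heads.
print(posterior.pdf(0.5))   # 4 * 0.5**3 = 0.5
print(posterior.mean())     # 4/5 = 0.8
```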

Then the posterior predictive distribution of the next observation is Bernoulli: $$\Pr[X_{\text{new}} = 1 \mid Y = y] = \int_{p=0}^1 p f_{p \mid y}(p) \, dp = \frac{\Gamma(a+b+n)\Gamma(a+y+1)}{\Gamma(a+y)\Gamma(a+b+n+1)} = \frac{a+y}{a+b+n},$$ and the posterior predictive distribution of the number of heads in the next $m$ coin tosses is beta-binomial (assuming we do not use interim observations to update the prior): $$\Pr[Y_{\text{new}} = y' \mid Y = y] = \binom{m}{y'} \int_{p=0}^1 p^{y'} (1-p)^{m-y'} f_{p \mid y}(p) \, dp = \binom{m}{y'} \frac{\Gamma(a+b+n)\Gamma(a+y'+y)\Gamma(b+n-y+m-y')}{\Gamma(a+y)\Gamma(b+n-y)\Gamma(a+b+n+m)}.$$
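The $m$-toss predictive pmf above is a ratio of beta functions, which is convenient to evaluate numerically. A sketch using SciPy's `betaln` for stability (the function name and signature here are illustrative, not from any library):

```python
import math
from scipy.special import betaln, comb

def posterior_predictive(y_new, m, y, n, a=1, b=1):
    """Pr[y_new heads in the next m tosses | y heads in n tosses],
    under a Beta(a, b) prior on the bias (beta-binomial form)."""
    return comb(m, y_new, exact=True) * math.exp(
        betaln(a + y + y_new, b + n - y + m - y_new) - betaln(a + y, b + n - y)
    )

# With m = 1 this reduces to (a + y) / (a + b + n):
# posterior_predictive(1, 1, 3, 3) == 4/5, matching the single-toss result.
```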

In your case, we have $a = b = 1$, $y = n = 3$, hence $$\Pr[X_{\text{new}} = 1 \mid Y = 3] = \frac{1 + 3}{1 + 1 + 3} = \frac{4}{5},$$ as you obtained.