Given a coin with an unknown bias and the observation of $N$ heads and $0$ tails, what is the expected probability that the next flip is a head?
I want to solve this with MLE, not a Bayesian analysis.
My attempt:
For any value of $p$, the probability of $k$ heads in $n$ tosses is given by
$\binom{n}{k} p^k \left ( 1-p \right )^{n-k}$
Consider the maximization problem:
$\frac{\partial p^k \binom{n}{k} (1-p)^{n-k}}{\partial p}=0$
$\hat{p}=\frac{k}{n}$
With $k=N$, $n=N$ this gives $\hat{p}=1$, and I'm stuck here. Thank you.
Answer: $\frac{N+1}{N+2}$
We observe $k=N$ heads in $N$ trials and want to determine the unknown probability $p$. The maximum likelihood estimate is the value of $p$ giving the largest probability for the observed data, which here is $\hat{p}=N/N=1$; on its own it cannot produce $\frac{N+1}{N+2}$, so we need a prior on $p$. Take the bias to be uniform on $0\leq p\leq 1$, i.e. a $\mathrm{Unif}(0, 1)$ prior, which is the beta distribution with $\alpha = 1$, $\beta = 1$ in our case.
Let's find the posterior distribution of $p$ for a prior $\pi(p) \sim \mathrm{Beta}(\alpha, \beta)$:
$\pi (p)=\frac{1}{B(\alpha ,\beta )}p^{\alpha -1}\left ( 1-p \right )^{\beta -1}$
$f(k|p)=\binom{n}{k}p^{k}(1-p)^{n-k}$
$f(p|k)=\frac{f(k|p)\,\pi(p)}{f_{K}(k)}$
$\propto p^{k+\alpha -1}(1-p)^{n-k+\beta -1}$
Based on this we can see that $f(p|k)$ has a $\mathrm{Beta}(k+\alpha,\, n-k+\beta)$ distribution.
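As a numerical sanity check that the posterior really has this beta form (a minimal pure-Python sketch; the values of $n$, $k$, $\alpha$, $\beta$ below are illustrative), the unnormalized posterior should integrate to the Beta function $B(k+\alpha,\, n-k+\beta)$:

```python
import math

# Check numerically that the unnormalized posterior p^(k+a-1) (1-p)^(n-k+b-1)
# integrates to the Beta function B(k+a, n-k+b), confirming the
# Beta(k+a, n-k+b) form of the posterior.
def unnormalized_posterior_integral(n, k, a, b, steps=100_000):
    total = 0.0
    for i in range(steps):
        p = (i + 0.5) / steps          # midpoint rule on (0, 1)
        total += p**(k + a - 1) * (1 - p)**(n - k + b - 1)
    return total / steps

def beta_fn(x, y):
    # B(x, y) = Gamma(x) Gamma(y) / Gamma(x + y)
    return math.gamma(x) * math.gamma(y) / math.gamma(x + y)

n, k, a, b = 10, 7, 1.0, 1.0
print(unnormalized_posterior_integral(n, k, a, b))  # ≈ B(8, 4)
print(beta_fn(k + a, n - k + b))
```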
With $\alpha = \beta = 1$ (the uniform prior), the posterior is proportional to $p^{k}(1-p)^{n-k}$, i.e. a beta distribution with parameters $k+1$ and $n-k+1$.
In our problem there are $N$ heads in $N$ tosses, so $k = n = N$.
The expected probability that the next flip is a head is the posterior mean of $p$ (maximizing the posterior instead would just return the MLE $\hat{p}=k/n$):
$E[p \mid k]=\int_{0}^{1} p\, f(p|k)\, dp=\frac{k+\alpha}{n+\alpha+\beta}=\frac{k+1}{n+2}$
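Equivalently, with the uniform prior the posterior mean can be computed directly as a ratio of Beta integrals, using $B(a,b)=\frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}$ and $\Gamma(a+1)=a\,\Gamma(a)$:

```latex
E[p \mid k]
  = \frac{\int_0^1 p \cdot p^{k}(1-p)^{n-k}\,dp}{\int_0^1 p^{k}(1-p)^{n-k}\,dp}
  = \frac{B(k+2,\,n-k+1)}{B(k+1,\,n-k+1)}
  = \frac{k+1}{n+2}
```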
With $k = n = N$: $p=\frac{N+1}{N+2}$
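This final answer (Laplace's rule of succession) is easy to check by simulation (a sketch with an illustrative $N=3$, so the prediction is $\frac{N+1}{N+2}=0.8$): draw a bias uniformly, keep only runs of $N$ straight heads, and look at the next flip.

```python
import random

# Monte Carlo check of Laplace's rule of succession: with a Uniform(0, 1)
# prior on the bias p, given N heads in N flips, the probability that the
# next flip is a head should approach (N + 1) / (N + 2).
def rule_of_succession_estimate(N, trials=200_000, seed=0):
    rng = random.Random(seed)
    next_heads = kept = 0
    for _ in range(trials):
        p = rng.random()                                  # unknown bias
        if all(rng.random() < p for _ in range(N)):       # N straight heads
            kept += 1
            if rng.random() < p:                          # the (N + 1)-th flip
                next_heads += 1
    return next_heads / kept

N = 3
print(rule_of_succession_estimate(N))   # ≈ (3 + 1) / (3 + 2) = 0.8
```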
(Note that it isn't necessary to find $f_K(k)$ explicitly; we can ignore the normalizing constants of both the likelihood and the prior.)