Question about Maximum Likelihood Estimation with single Trial


So I came across the following example in a text I was reading.

Consider a coin with unknown heads probability $p \in [0, 1]$, and conduct the following experiment: Toss the coin once. If it is tails, stop. If it is heads, toss the coin again and then stop. (So the only possible outcomes are $T$, $HT$, and $HH$.) Define the random variable $X$ as the number of heads you obtain. You want to estimate $p$ on the basis of a single observation of $X$.

I am wondering how to apply the MLE method here. I want to find the likelihood, but am unsure how to proceed when there is only one observation of $X$. I can easily tell that $$\mathbb{P}(X=0)=1-p,$$ $$\mathbb{P}(X=1)=p(1-p),$$ $$\mathbb{P}(X=2)=p^2.$$
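As a sanity check on these three probabilities, they can be encoded as a function of $p$ and verified to sum to 1, since $T$, $HT$, $HH$ are the only possible outcomes. (A minimal sketch, not from the original post; the function name `pmf` is my own.)

```python
def pmf(x, p):
    """Probability of observing x heads in the stop-after-first-tails experiment."""
    if x == 0:
        return 1 - p        # outcome T
    if x == 1:
        return p * (1 - p)  # outcome HT
    if x == 2:
        return p ** 2       # outcome HH
    raise ValueError("x must be 0, 1, or 2")

# The three outcomes are exhaustive, so the probabilities sum to 1 for any p:
for p in (0.0, 0.3, 0.7, 1.0):
    assert abs(sum(pmf(x, p) for x in range(3)) - 1.0) < 1e-12
```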

My intuition then tells me that if $X=0$ we should guess $p=0$; if $X=1$, $p=1/2$; and if $X=2$, $p=1$, based on which value of $p$ makes the observation most likely. I am unsure whether this is correct, or how to show it more precisely.

Accepted answer:

Your guess is right. Formalizing it is simple:

  • If $X=0$, we choose $p$ to maximize $1-p$, which is decreasing on $[0,1]$, so $p=0$.
  • If $X=1$, we choose $p$ to maximize $p(1-p)$; setting its derivative $1-2p$ to zero gives $p=1/2$.
  • If $X=2$, we choose $p$ to maximize $p^2$, which is increasing on $[0,1]$, so $p=1$.
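These three maximizers can also be confirmed numerically with a simple grid search over $p \in [0,1]$. (A hypothetical check of my own, not part of the original answer; `pmf` and `mle` are names I chose.)

```python
def pmf(x, p):
    # likelihood of observing x heads, as derived in the question
    return [1 - p, p * (1 - p), p ** 2][x]

def mle(x, grid_size=10001):
    """Grid-search maximum likelihood estimate of p given one observation x."""
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    return max(grid, key=lambda p: pmf(x, p))

for x in range(3):
    print(x, mle(x))  # 0 -> 0.0, 1 -> 0.5, 2 -> 1.0
```

The grid search agrees with the calculus: the likelihood for each observed value of $X$ is maximized at $p = 0$, $1/2$, and $1$ respectively.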