So I came across the following example in a text I was reading.
Consider a coin with unknown heads probability $p \in [0, 1]$, and conduct the following experiment: Toss the coin once. If it is tails, stop. If it is heads, toss the coin once more and then stop. (So the only possible outcomes are $T$, $HT$, and $HH$.) Define the random variable $X$ as the number of heads you obtain. You want to estimate $p$ on the basis of a single observation of $X$.
I am wondering how I can apply the MLE method. I am interested in finding the likelihood, but am unsure how to proceed when there is only one observation of $X$. I can tell easily that $$\mathbb{P}(X=0)=1-p$$ $$\mathbb{P}(X=1)=p(1-p)$$ $$\mathbb{P}(X=2)=p^2$$
My intuition tells me that if $X=0$ we would guess $p=0$; if $X=1$, $p=1/2$; and if $X=2$, $p=1$, based on which value of $p$ makes the observation most likely. I am unsure if this is correct, or how to show it more precisely.
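As a quick sanity check (not from the original post), the three likelihoods above can be maximized numerically over a grid of $p$ values; the grid maximizers should land on $0$, $1/2$, and $1$:

```python
import numpy as np

def likelihood(x, p):
    """Likelihood of p given a single observation X = x,
    matching the probabilities derived above."""
    if x == 0:
        return 1 - p
    if x == 1:
        return p * (1 - p)
    return p ** 2  # x == 2

# Evaluate each likelihood on a fine grid over [0, 1] and take the argmax.
p_grid = np.linspace(0, 1, 10001)
for x in range(3):
    values = [likelihood(x, p) for p in p_grid]
    p_hat = p_grid[int(np.argmax(values))]
    print(f"X = {x}: grid MLE p_hat = {p_hat:.2f}")
```

This is only a numerical sketch; the calculus argument below it is the real proof.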
Your guess is right. Formalizing it is simple:
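For a single observation $X = x$, the likelihood is just $$L(p) = \mathbb{P}(X = x),$$ viewed as a function of $p \in [0,1]$, and the MLE $\hat p$ is its maximizer. Checking each case:

- If $X = 0$: $L(p) = 1 - p$ is decreasing on $[0,1]$, so $\hat p = 0$.
- If $X = 1$: $L(p) = p(1-p)$ has $L'(p) = 1 - 2p$, which vanishes at $p = 1/2$; since $L(0) = L(1) = 0$ and $L(1/2) = 1/4 > 0$, this interior critical point is the maximum, so $\hat p = 1/2$.
- If $X = 2$: $L(p) = p^2$ is increasing on $[0,1]$, so $\hat p = 1$.

In each case the MLE is exactly the guess you made.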