I learnt that the variance of the maximum likelihood estimator can be calculated as:
$$\begin{aligned} \operatorname{var}(\theta) &=[I(\theta)]^{-1} \\ &=(-E[H(\theta)])^{-1} \\ &=\left(-E\left[\frac{\partial^{2} \ln \mathcal{L}(\theta)}{\partial \theta \partial \theta^{\prime}}\right]\right)^{-1} \end{aligned}$$
I want to determine how biased a coin is. I flipped it 3 times and got 3 heads.
The likelihood of getting 3 straight heads is $\mathcal{L} = \theta^3$ where $0 \leq \theta \leq 1$
$ \ln \mathcal{L} = 3 \ln \theta $
Since $\frac{d}{d\theta} 3 \ln \theta = 3/\theta > 0$ on $(0, 1]$, the log-likelihood is strictly increasing, so it is maximized at the boundary. This means that my $p_{MLE} = 1.0$, where $p$ is the probability of getting heads on a single coin flip and represents the $\theta$ that I'm trying to estimate.
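A quick numerical check of this (a minimal sketch; the grid search is just one way to confirm the boundary maximum):

```python
import numpy as np

# The likelihood for 3 heads in 3 flips is L(theta) = theta**3, which is
# increasing on [0, 1], so the maximum sits at the boundary theta = 1.
theta = np.linspace(0.0, 1.0, 10_001)
likelihood = theta ** 3
p_mle = theta[np.argmax(likelihood)]
print(p_mle)  # → 1.0
```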
Now obviously, because I've only flipped it 3 times, the variance $\text{Var}(p_{MLE})$ will be relatively high. But what is its actual numerical value?
I also want to show that if I keep flipping the coin and keep getting heads, this variance goes to zero.
From a similar old post it was found that this variance is equal to:
$$\text{Var}(p_{MLE})=\frac{p(1-p)}{n}$$
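For what it's worth, here is a sketch of how that formula follows from the Fisher-information expression above, for $n$ independent Bernoulli($p$) flips $X_1, \dots, X_n$:

$$\ln \mathcal{L}(p) = \sum_{i=1}^{n}\left[X_i \ln p + (1 - X_i) \ln(1 - p)\right], \qquad \frac{\partial^{2} \ln \mathcal{L}}{\partial p^{2}} = -\sum_{i=1}^{n}\left[\frac{X_i}{p^{2}} + \frac{1 - X_i}{(1-p)^{2}}\right]$$

Taking expectations with $E[X_i] = p$ gives

$$I(p) = -E\left[\frac{\partial^{2} \ln \mathcal{L}}{\partial p^{2}}\right] = \frac{np}{p^{2}} + \frac{n(1-p)}{(1-p)^{2}} = \frac{n}{p(1-p)},$$

so $\text{Var}(p_{MLE}) \approx [I(p)]^{-1} = \frac{p(1-p)}{n}$.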
However, I can't see how to get a numerical answer for the variance of $p_{MLE}$ when I got three straight heads. I know that $n = 3$, but what value am I supposed to substitute for $p$?
$p_{MLE}$ is a random variable. More specifically, here $$p_{MLE} = p_{MLE} (X_1,X_2,X_3) = \frac{1}{3} (X_1 + X_2 + X_3)$$ where the $X_i$'s are Bernoulli random variables. In the sense in which you're defining your problem, there is no such thing as the variance of $p_{MLE}$ when you get three straight heads: once you plug in the observed outcome (three straight heads), $p_{MLE}$ becomes a realized number rather than a random variable, so its variance is not defined.
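To see the distinction concretely, you can simulate the sampling distribution of $p_{MLE}$ for some hypothetical true bias (the value $p = 0.8$ below is assumed purely for illustration; it is exactly what you don't know in practice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true bias of the coin, assumed only for this simulation.
p, n = 0.8, 3

# Simulate many experiments of n flips each; p_MLE is the sample mean
# of the flips in each experiment.
flips = rng.binomial(1, p, size=(200_000, n))
p_mle = flips.mean(axis=1)

empirical_var = p_mle.var()
theoretical_var = p * (1 - p) / n
print(empirical_var, theoretical_var)
```

The empirical variance of the simulated $p_{MLE}$ values matches $p(1-p)/n$ closely, but any single realized value of $p_{MLE}$ (such as $1.0$ after three heads) is just a number with no variance of its own.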