I have derived the MLE for $p$ in the geometric distribution, $\hat p=1/\overline X$. I now need to find its asymptotic variance. Trying to take the variance of $1/\overline X$ directly seems intractable. I do have the following facts about the variance:
$$ \operatorname{Var}(\hat \theta_\text{MLE}) \approx \{-E[\ell''(\theta\mid\vec X)]\}^{-1} \approx [-\ell''(\hat \theta_\text{MLE}\mid\vec X)]^{-1} $$
But trying to find the required expected value seems no easier. Since I’m looking for the asymptotic variance, maybe the last approximation above is fine.
$$ \ell'' = \sum \frac{\partial^2}{\partial \theta^2}\ln f(X_i \mid \theta) $$
$$ = \sum \frac{\partial^2}{\partial \theta^2} \ln [(1-\theta)^{X_i-1} \theta] = \sum \frac{\partial^2}{\partial\theta^2}[(X_i-1)\ln(1-\theta)+\ln\theta] $$
$$ = \sum \frac{\partial}{\partial \theta} \left[\frac{-(X_i-1)}{1-\theta} + \frac{1}{\theta}\right] = \sum \left[-\frac{X_i-1}{(1-\theta)^2}-\frac{1}{\theta^2}\right]$$
But after all this I don’t think it’s going anywhere productive. At least I don’t see how I would find the sum as $n\rightarrow \infty$.
I also have that
$$ \sqrt{nI(\theta_0)}(\hat\theta-\theta_0) $$
tends to a standard normal distribution, but I have no sense of what it tells me or whether it is useful for this problem. Besides that, computing the information seems to require computing a sum just as mysterious as the one above.
You are almost there.
The asymptotic variance of $\hat{\theta}_\text{MLE}$ is
$$\frac{1}{nI(\theta)}=\frac{1}{-n\mathbb{E}\Big\{\frac{\partial^2}{\partial \theta^2}\log f(X\mid\theta)\Big\}}$$
where
$$\frac{\partial^2}{\partial \theta^2}\log f(X\mid\theta)=-\frac{1}{\theta^2}-\frac{X-1}{(1-\theta)^2}$$
And knowing that $\mathbb{E}[X]=\frac{1}{\theta}$ you immediately get
$$-n\mathbb{E}\Bigg\{\frac{\partial^2}{\partial \theta^2}\log f(X \mid \theta)\Bigg\} = n\left[\frac{1}{\theta^2}+\frac{\frac{1}{\theta}-1}{(1-\theta)^2}\right]=n\left[\frac{1}{\theta^2}+\frac{1}{\theta(1-\theta)}\right] = \frac{n}{\theta^2(1-\theta)}$$
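As a sanity check (not part of the derivation), the expectation step can be verified symbolically with sympy. Substituting $\mathbb{E}[X]=1/\theta$ for $X$ is legitimate here because the second derivative is linear in $X$:

```python
import sympy as sp

theta, X = sp.symbols('theta X', positive=True)

# log-density of the geometric pmf f(X | theta) = (1-theta)^(X-1) * theta
logf = (X - 1) * sp.log(1 - theta) + sp.log(theta)

# second derivative of the log-density with respect to theta
d2 = sp.diff(logf, theta, 2)

# since d2 is linear in X, E[-d2] is obtained by substituting E[X] = 1/theta
expected_info = -d2.subs(X, 1 / theta)

# this should simplify to 1 / (theta^2 * (1 - theta))
print(sp.simplify(expected_info - 1 / (theta**2 * (1 - theta))))  # 0
```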
Thus, asymptotically,
$$\mathbb{V}[\hat{\theta}]= \frac{\theta^2(1-\theta)} {n} $$
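The result is also easy to check by simulation. The sketch below (the values of $\theta$, $n$, and the replication count are arbitrary choices for illustration) compares the empirical variance of $\hat\theta = 1/\overline X$ across many samples with $\theta^2(1-\theta)/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3    # true success probability (arbitrary choice)
n = 1000       # sample size per replication
reps = 5000    # number of Monte Carlo replications

# rng.geometric counts trials up to and including the first success,
# matching f(x | theta) = (1 - theta)^(x-1) * theta
samples = rng.geometric(theta, size=(reps, n))

# the MLE theta_hat = 1 / X-bar, computed once per replication
mle = 1.0 / samples.mean(axis=1)

empirical = mle.var()
asymptotic = theta**2 * (1 - theta) / n
print(empirical, asymptotic)  # the two should agree to within a few percent
```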