Find the Cramér-Rao lower bound for unbiased estimators of $\theta$, and then give the approximate distribution of $\hat{\theta}$ as $n$ gets large. This is for a geometric($\theta$) distribution.
I am stuck on calculating the Fisher information, which is given by $-n\operatorname{E}_{\theta}\left(\dfrac{d^{2}}{d\theta^{2}}\log f(X\mid\theta)\right)$. So far, I have the second derivative of the log-likelihood as $\dfrac{-n}{\theta^{2}}+\dfrac{n-\sum x_{i}}{(1-\theta)^{2}}$. I just need some help finding the expectation of this.
By definition, the Fisher information $F(\theta)$ is equal to the expectation
$$F(\theta)=-\operatorname{E}_{\theta}\left[\frac{\partial^2 \ell(x,\theta)}{\partial \theta^2}\right]=\operatorname{E}_{\theta}\left[\left(\frac{\partial \ell(x,\theta)}{\partial \theta}\right)^2\right],$$
where $\theta$ is a parameter to estimate and
$$\ell(x,\theta):=\log p(x,\theta), $$
denoting by $p(x,\theta)$ the probability density (or mass) function of the given random variable $X$.
The expectation $\operatorname{E}_{\theta}$ is taken w.r.t. $p(x,\theta)$. In other words,
$$F(\theta)=-\int \frac{\partial^2 \ell(x,\theta)}{\partial \theta^2}\, p(x,\theta)\,dx$$
for a continuous random variable $X$, and similarly for discrete ones. Just use that
$$\operatorname{E}_{\theta}[f(X)]:=\sum_{k}f(k)p(k,\theta),$$ with $P_{\theta}(X=k):=p(k,\theta)$.
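To make the hint concrete: assuming the parametrization $f(x\mid\theta)=\theta(1-\theta)^{x-1}$ for $x=1,2,\dots$ (i.e. $X$ counts trials up to and including the first success, so $\operatorname{E}_{\theta}[X]=1/\theta$), the expectation works out as

$$\begin{align*}
\operatorname{E}_{\theta}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X\mid\theta)\right]
&= -\frac{1}{\theta^{2}}-\frac{\operatorname{E}_{\theta}[X]-1}{(1-\theta)^{2}}
 = -\frac{1}{\theta^{2}}-\frac{(1-\theta)/\theta}{(1-\theta)^{2}} \\
&= -\frac{1}{\theta^{2}}-\frac{1}{\theta(1-\theta)}
 = -\frac{1}{\theta^{2}(1-\theta)},
\end{align*}$$

so the Fisher information of the sample is $n/\bigl(\theta^{2}(1-\theta)\bigr)$, the Cramér-Rao lower bound is $\theta^{2}(1-\theta)/n$, and for large $n$ the MLE $\hat{\theta}$ is approximately $N\!\bigl(\theta,\ \theta^{2}(1-\theta)/n\bigr)$.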
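As a quick numerical sanity check (not part of the derivation, and with variable names of my own choosing), one can simulate geometric samples and compare the variance of the MLE $\hat{\theta}=n/\sum x_{i}$ with the bound $\theta^{2}(1-\theta)/n$; NumPy's `Generator.geometric` uses the same support $\{1,2,\dots\}$ assumed above:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 200, 5000

# reps independent samples of size n, each entry counting trials to first success
samples = rng.geometric(theta, size=(reps, n))

# MLE for theta in this parametrization: n / sum(x_i)
theta_hat = n / samples.sum(axis=1)

# Cramér-Rao lower bound for unbiased estimators of theta
crlb = theta**2 * (1 - theta) / n

print("simulated Var(theta_hat):", theta_hat.var())
print("CRLB:                    ", crlb)
```

For moderate $n$ the simulated variance should sit close to (slightly above) the bound, in line with the asymptotic normality of the MLE.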