I want to calculate the MLE of the parameter $p$ of the Geometric distribution and show that it is consistent: $$\mathbb{P}(X=x)=p(1-p)^{x-1}$$
$$ l(p) = p(1-p)^{x_1-1}\cdot p(1-p)^{x_2-1} \cdots p(1-p)^{x_n-1} \\ L(p)=\ln l(p)=n\ln p + (x_1 + \cdots +x_n -n)\ln(1-p) \\ L'(p)= \frac{n}{p} - \frac{x_1 + \cdots + x_n - n}{1-p} \\ \text{Setting } L'(p)=0: \quad \hat{p} = \frac{n}{x_1 + \cdots + x_n} = \frac{1}{\overline{X}} $$
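As a quick sanity check of the closed form $\hat p = 1/\overline X$, here is a small simulation sketch; the true $p$ and sample size are illustrative choices, not from the derivation above:

```python
import numpy as np

# Sketch: check the closed-form MLE p_hat = 1 / X_bar on simulated data.
# p_true and n are illustrative assumptions.
rng = np.random.default_rng(0)
p_true = 0.3
n = 100_000
# numpy's geometric has support {1, 2, ...} with P(X=k) = p(1-p)^(k-1),
# matching the parameterization in the question.
x = rng.geometric(p_true, size=n)

p_hat = 1.0 / x.mean()  # the MLE derived above: n / (x_1 + ... + x_n)
print(p_hat)  # should be close to p_true for large n
```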
Now, to check consistency I would use Chebyshev's inequality:
$$ \mathbb{P}(|\hat{p_n} - \mathbb{E}(\hat{p_n})| \geq \epsilon) \leq \frac{\operatorname{Var}(\hat{p_n})}{\epsilon^2} $$
I am stuck on calculating the variance, though. The variance of the sample mean would be a breeze; here, however, the estimator is its inverse. How can I proceed? Is it just the reciprocal of $\operatorname{Var}(\overline{X}) = \frac{\operatorname{Var}(X)}{n}$, i.e. $\frac{n}{\operatorname{Var}(X)}$? And what is $\mathbb{E}(\hat{p}_n)$?
It is a general fact that maximum likelihood estimators are consistent under some regularity conditions. In particular, these conditions hold here because the distribution of $X$ is a member of a regular exponential family. Some discussion can be found in these lecture notes, but you can also find these results in any textbook covering asymptotic theory.
Consistency can also be justified directly from the law of large numbers, which gives
$$\overline X \stackrel{P}\longrightarrow \operatorname E\,[X_1]=\frac1p$$
Since $t \mapsto 1/t$ is continuous at $1/p$, the continuous mapping theorem implies
$$\frac1{\overline X} \stackrel{P}\longrightarrow p$$
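This convergence is easy to see numerically. The following sketch simulates the estimator at increasing sample sizes; the value of $p$ and the sample sizes are illustrative assumptions:

```python
import numpy as np

# Sketch of consistency: the MLE 1 / X_bar concentrates around p as n grows.
# p and the sample sizes below are illustrative choices.
rng = np.random.default_rng(1)
p = 0.25

ests = {}
for n in [10, 100, 10_000, 1_000_000]:
    x = rng.geometric(p, size=n)  # P(X=k) = p(1-p)^(k-1), k = 1, 2, ...
    ests[n] = 1.0 / x.mean()
    print(n, ests[n])  # estimates drift toward p = 0.25 as n increases
```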
Alternatively, you can use a Taylor expansion of moments (the delta method) to say that $\operatorname E\left[\frac1{\overline X}\right]\approx p$ and $\operatorname{Var}\left[\frac1{\overline X}\right]\approx \frac{p^2(1-p)}{n}$ for large $n$, so that $\operatorname E\left[\frac1{\overline X}\right]\to p$ and $\operatorname{Var}\left[\frac1{\overline X}\right]\to 0$ as $n\to \infty$. Together these are a sufficient condition for convergence in probability, via Chebyshev's inequality.
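The delta-method approximations can be checked by Monte Carlo: simulate many independent samples of size $n$, compute $1/\overline X$ for each, and compare the empirical mean and variance to $p$ and $p^2(1-p)/n$. The values of $p$, $n$, and the number of replications are illustrative assumptions:

```python
import numpy as np

# Monte Carlo check of the delta-method approximations
#   E[1/X_bar]   ≈ p
#   Var[1/X_bar] ≈ p^2 (1 - p) / n
# p, n, and reps are illustrative choices.
rng = np.random.default_rng(2)
p, n, reps = 0.4, 500, 20_000

samples = rng.geometric(p, size=(reps, n))   # reps independent samples of size n
p_hats = 1.0 / samples.mean(axis=1)          # one MLE per replication

approx_var = p**2 * (1 - p) / n
print(p_hats.mean(), p)           # empirical mean vs. p
print(p_hats.var(), approx_var)   # empirical variance vs. p^2(1-p)/n
```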