Let $X$ be a random variable following a Gamma distribution, $f(x ; \alpha,\beta)$, with:
$E(X) = \alpha \beta$
$Var(X) = \alpha \beta^2$
$\hat \alpha = \frac{\bar X}{\beta}$
$\hat \beta = \frac{n \bar X^2}{\sum_{i=1}^n (X_i - \bar X)^2}$
Show that $\hat \alpha$ and $\hat \beta$ are consistent estimators.
I've spent pages and pages trying to work these into:
$$ E\big((\hat \theta - \theta)^2\big) \to 0 \text{ as } n \to \infty $$
e.g. trying to show $E\big((\hat \beta - \beta)^2\big) \to 0$ as $n \to \infty$, but I end up with pages of work, often arriving at expressions that depend on the other estimator and still don't tend to $0$ as $n \to \infty$.
Is there any advice somebody can give me for solving these? I feel like I'm missing a key trick.
For the $\hat{\alpha}$ part it should be quite straightforward. For $$\hat{\beta} = \frac {\bar{X}^2} {\displaystyle \frac {1} {n} \sum_{i=1}^n (X_i - \bar{X})^2}$$ you may apply the continuous mapping theorem: both the sample mean and the sample variance are consistent estimators, so the numerator and the denominator each converge in probability to their respective limits, and the continuous mapping theorem applied to the quotient then combines the two results.
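To see the continuous mapping theorem argument in action, here is a quick simulation sketch (not part of the proof; the parameter values $\alpha = 3$, $\beta = 2$ are arbitrary choices for illustration). By the law of large numbers, $\bar X \to \alpha\beta$ and $\frac{1}{n}\sum (X_i - \bar X)^2 \to \alpha\beta^2$ in probability, so the ratio converges to $(\alpha\beta)^2 / (\alpha\beta^2)$, and the estimate should stabilize as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0  # hypothetical true shape and scale

for n in (100, 10_000, 1_000_000):
    x = rng.gamma(shape=alpha, scale=beta, size=n)
    xbar = x.mean()
    s2 = x.var()  # (1/n) * sum (x_i - xbar)^2, matching the denominator above
    beta_hat = xbar**2 / s2
    # LLN: xbar -> alpha*beta and s2 -> alpha*beta**2 in probability;
    # the continuous mapping theorem then gives
    # beta_hat -> (alpha*beta)**2 / (alpha*beta**2).
    print(n, round(xbar, 3), round(s2, 3), round(beta_hat, 3))
```

The printed estimates should wander less and less around the limiting value as $n$ increases, which is exactly what consistency (convergence in probability) promises.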