I need to show that the method of moments estimator of the rate parameter $\beta$ of the Gamma distribution (with $\mu = \frac{\alpha}{\beta}$ and $\sigma^2 = \frac{\alpha}{\beta^2}$) is biased upwards when $\mu > 0$. I have obtained the estimator $\hat{\beta} = \frac{\bar{X}}{\bar{X^{2}}-(\bar{X})^{2}}$.
My progress so far: Jensen's inequality says that $$ \mathbb{E}[g(\bar{X})] > g(\mathbb{E}[\bar{X}]),$$ so $$ \mathbb{E}\left[ \frac{\bar{X}}{\bar{X^{2}}-(\bar{X})^{2}} \right ] > \frac{\mathbb{E}[\bar{X}]}{\mathbb{E}[\bar{X^2}]-\mathbb{E}[(\bar{X})^2]} = \frac{\mu}{\sigma^2} = \beta.$$
So it is definitely biased, provided the inequality holds, but I don't know how to prove that it does. I know that I need to check convexity (take the second derivative of the function inside the expectation on the left-hand side), but I'm confused by $\bar{X^2}$: I don't know how to construct a function of which to take the derivative.
I would be very grateful if somebody could point me in the right direction.
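For what it's worth, a quick Monte Carlo check (a sketch in Python/NumPy; the parameter values $\alpha=2$, $\beta=3$ and sample size $n=20$ are arbitrary choices of mine) does suggest the bias is upward:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 3.0   # true shape and rate
n, reps = 20, 200_000    # sample size and number of replications

# NumPy's gamma sampler is parameterized by scale = 1/rate
samples = rng.gamma(shape=alpha, scale=1.0 / beta, size=(reps, n))

xbar = samples.mean(axis=1)             # \bar X
x2bar = (samples ** 2).mean(axis=1)     # \bar{X^2}
beta_hat = xbar / (x2bar - xbar ** 2)   # method-of-moments estimator

print(beta_hat.mean())  # noticeably larger than beta = 3
```

But of course a simulation is not a proof.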
You just need to check that $$E\left[\frac{1}{\bar X ^2 - (\bar X )^2}\right]\ge \frac{1}{E(\bar X ^2 - (\bar X )^2)}$$
which is true because $1/x$ is convex on $(0,\infty)$. You can check this using
- a graph, which must curve upward, or
- the definition of convexity, i.e. $$\frac{1}{tx_1+(1-t)x_2}\le t\frac 1{x_1}+(1-t)\frac1{x_2}$$ for $0<t<1$ and $x_1,x_2>0$.
Clearing denominators (everything is positive), we need to compare
$$x_1x_2\text{ vs. } x_1x_2[2t^2-2t+1]+t(1-t)[x_1^2+x_2^2].$$
Subtracting the left side from the right gives $$x_1x_2[2t^2-2t]+t(1-t)[x_1^2+x_2^2]=t(1-t)(x_1^2-2x_1x_2+x_2^2)=t(1-t)(x_1-x_2)^2\ge 0,$$ so the RHS is no less than the LHS, establishing convexity.
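If you want to sanity-check the inequality numerically rather than algebraically, here is a throwaway Python snippet (the grid of values is my own arbitrary choice) testing the convexity definition for $1/x$ on positive inputs:

```python
import itertools

# Check 1/(t*x1 + (1-t)*x2) <= t/x1 + (1-t)/x2 on a grid of positive points
xs = [0.1, 0.5, 1.0, 2.0, 10.0]
ts = [0.1, 0.25, 0.5, 0.75, 0.9]

for x1, x2, t in itertools.product(xs, xs, ts):
    lhs = 1.0 / (t * x1 + (1 - t) * x2)
    rhs = t / x1 + (1 - t) / x2
    assert lhs <= rhs + 1e-12, (x1, x2, t)  # small tolerance for float error

print("convexity of 1/x holds on the grid")
```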
On the other hand, are you sure that $E(\bar X^2-(\bar X)^2)=\sigma^2$? (Compute $E[\bar{X^2}]$ and $E[(\bar X)^2]$ separately and check.) If so, your result is done.
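As a hint for that last check, you can estimate $E(\bar{X^2}-(\bar X)^2)$ by simulation (a sketch with NumPy; the gamma parameters and $n$ here are arbitrary) and compare it with $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 3.0
sigma2 = alpha / beta**2          # true variance of the Gamma distribution
n, reps = 10, 500_000

x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(reps, n))
v = (x**2).mean(axis=1) - x.mean(axis=1) ** 2   # \bar{X^2} - (\bar X)^2

print(v.mean(), "vs sigma^2 =", sigma2)  # the average falls short of sigma^2
```

The systematic shortfall you should observe is exactly why the question at the end matters.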