I am working on this problem.
Find a consistent estimator for $E[X^2]$ when $X \sim \text{Exp}(\beta)$ .
So far I am thinking of using the invariance property of MLEs, so I let
$$\hat{\theta} = \bar{X}$$
And since $E[X^2]=2\beta^2$ and $E[\bar{X}^2]=\beta^2(1/n+1)$ I thought that
$$\hat{\theta}_2=2\bar{X}^2$$ would be better since it is now asymptotically unbiased.
Now I want to show that $\operatorname{Var}[\hat{\theta}_2] \rightarrow 0$ as $n \rightarrow \infty$ but I am stuck because I don't know what to do after I get
$$4\operatorname{Var}[\bar{X}^2]$$
May I have some help, please?
If $X_1,X_2,\ldots,X_n$ are i.i.d. $\mathsf{Exp}$ with mean $\beta$, then by the law of large numbers
$$\frac{1}{n}\sum_{i=1}^n X_i^2\stackrel{P}\longrightarrow E(X_1^2)=2\beta^2$$
So a consistent estimator of $2\beta^2$ based on a sample of $n$ observations is simply $T_1=\frac{1}{n}\sum\limits_{i=1}^n X_i^2$.
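As a quick numerical sanity check of this convergence (the choice $\beta = 2$ is arbitrary, so the target is $2\beta^2 = 8$; nothing here is specific to that value):

```python
import numpy as np

# Simulate T1 = (1/n) * sum(X_i^2) for Exp samples with mean beta = 2
# (an arbitrary choice for illustration; the target is 2*beta^2 = 8).
rng = np.random.default_rng(0)
beta = 2.0
for n in (100, 10_000, 1_000_000):
    x = rng.exponential(scale=beta, size=n)
    t1 = np.mean(x**2)
    print(n, t1)  # T1 should approach 2*beta^2 = 8 as n grows
```

The printed values should settle near $8$, consistent with $T_1\stackrel{P}\longrightarrow 2\beta^2$.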
Similarly, with $\overline X=\frac{1}{n}\sum\limits_{i=1}^n X_i$ we have $$\overline X\stackrel{P}\longrightarrow \beta$$
And by the continuous mapping theorem, $$2\overline X^2\stackrel{P}\longrightarrow 2\beta^2$$
So regarding your idea, another consistent estimator is $T_2=2\overline X^2$.
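The same kind of simulation (again with the arbitrary choice $\beta = 2$) checks $T_2$:

```python
import numpy as np

# Simulate T2 = 2 * (sample mean)^2 for Exp samples with mean beta = 2
# (arbitrary illustrative value; the target is again 2*beta^2 = 8).
rng = np.random.default_rng(1)
beta = 2.0
x = rng.exponential(scale=beta, size=1_000_000)
t2 = 2 * np.mean(x)**2
print(t2)  # should be close to 2*beta^2 = 8
```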
If you wish to find the large-sample variance of $T_2$, then you can use a Taylor expansion (the delta method) on $g(x)=2x^2$: since $g'(\beta)=4\beta$ and $\operatorname{Var}[\overline X]=\beta^2/n$,
$$\operatorname{Var}\left[2\overline X^2\right]\approx \operatorname{Var}\left[\overline X\right](g'(\beta))^2=\frac{\beta^2}{n}\cdot 16\beta^2=\frac{16\beta^4}{n} $$
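A Monte Carlo check of that delta-method approximation (with the arbitrary choices $\beta = 2$, $n = 500$, and $20{,}000$ replications) can be sketched as:

```python
import numpy as np

# Compare the empirical variance of T2 = 2*Xbar^2 against the
# delta-method approximation 16*beta^4/n.
# beta = 2 and n = 500 are arbitrary choices for this sketch.
rng = np.random.default_rng(2)
beta, n, reps = 2.0, 500, 20_000
samples = rng.exponential(scale=beta, size=(reps, n))
t2 = 2 * samples.mean(axis=1)**2        # one T2 value per replication
emp_var = t2.var()                      # empirical variance over replications
approx_var = 16 * beta**4 / n           # delta-method value: 256/500 = 0.512
print(emp_var, approx_var)
```

The two numbers should agree closely for moderate $n$, since $\overline X$ is already nearly normal at $n = 500$.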