Suppose that $X_1,\dots,X_n$ is an iid sample that is distributed as $\mathrm{Gamma}(4,\theta)$ and suppose that $T = \frac{1}{4}\overline{X}$ is an estimator of $\theta$. How do you calculate $MSE(T;\theta)$?
I don't know how to start this problem, because I know $MSE(T;\theta) = Var(T) - Bias(T;\theta)^2$ but I don't know how to calculate $Var(T)$. I also calculated $Bias(T;\theta)$ as follows:
$Bias(T;\theta) = E(T) - \theta = E(\frac{1}{4}\overline{X}) - \theta = \dots = \frac{1}{\theta} - \theta = \frac{1-\theta^2}{\theta}$. But I'm not sure if that calculation is even correct. Can someone confirm whether it is?
First, when you say that the distribution is a $\mathrm{Gamma}(4,\theta)$, it is not well determined, because the Gamma distribution has different parametrizations.
Second, when you write that the sample "is distributed as...", I hope you mean that the random sample is "from a population with distribution..." or "where every single observation is distributed as..."
Third,
$MSE=VAR + (BIAS)^2$ and not $VAR - (BIAS)^2$ as you wrote: with a minus sign, the greater the bias, the better the estimator would be...
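To see where the plus sign comes from, expand the squared error around $\mathbb{E}[T]$:

$$MSE(T;\theta)=\mathbb{E}\big[(T-\theta)^2\big]=\mathbb{E}\big[(T-\mathbb{E}[T]+\mathbb{E}[T]-\theta)^2\big]=Var(T)+\big(Bias(T;\theta)\big)^2$$

since the cross term $2\,\mathbb{E}\big[T-\mathbb{E}[T]\big]\cdot(\mathbb{E}[T]-\theta)$ vanishes.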
Fourth,
if the problem is only in calculating
$$V(T)=V\Bigg[\frac{1}{4}\overline{X}_n\Bigg]$$
it is simply
$$V(T)=V\Bigg[\frac{1}{4}\frac{\sum_i X_i}{n}\Bigg]=\frac{1}{(4n)^2}\cdot n V(X_1)$$
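If it helps, here is a quick Monte Carlo sanity check of that variance formula. It is only a sketch, and it assumes the shape–rate parametrization, so that $V(X_1)=4/\theta^2$ (note that Python's `random.gammavariate(alpha, beta)` takes shape and *scale*, so the rate $\theta$ enters as scale $1/\theta$):

```python
# Monte Carlo check of Var(T) = n * Var(X_1) / (4n)^2 = Var(X_1) / (16n).
# Assumption: shape-rate parametrization, X ~ Gamma(shape=4, rate=theta),
# so Var(X_1) = 4 / theta**2.
import random
import statistics

random.seed(0)
theta, n, reps = 2.0, 10, 20_000

t_values = []
for _ in range(reps):
    sample = [random.gammavariate(4, 1 / theta) for _ in range(n)]
    t_values.append(sum(sample) / (4 * n))  # T = (1/4) * sample mean

var_x1 = 4 / theta**2
print(statistics.variance(t_values))  # simulated Var(T)
print(var_x1 / (16 * n))              # formula above
```

The two printed values should agree to a few decimal places; the discrepancy shrinks as `reps` grows.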
This depends on which parametrization you used for the Gamma distribution. In the link you can find the two most common parametrizations.
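Concretely, with the shape fixed at $4$, the two common parametrizations give (standard Gamma facts):

$$\text{shape–scale: } \mathbb{E}[X]=4\theta,\quad V(X)=4\theta^2 \qquad\qquad \text{shape–rate: } \mathbb{E}[X]=\frac{4}{\theta},\quad V(X)=\frac{4}{\theta^2}$$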
IMHO, if you use $T$ to estimate the parameter $\theta$, the parametrization that makes the most sense is the one in the first column shown in the link, so that $\mathbb{E}[T]=\theta$ and your estimator is unbiased... but this is only my opinion. Only you know the exact parametrization you have to use.
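You can also see the difference empirically. The sketch below (my assumption about the two candidate parametrizations, not something fixed by your problem statement) simulates $\mathbb{E}[T]$ under each; `random.gammavariate` natively uses the shape–scale form:

```python
# Which parametrization makes T = (1/4) * mean(X) unbiased for theta?
import random

random.seed(1)
theta, n_draws = 2.0, 200_000

# Shape-scale: X ~ Gamma(4, scale=theta)  =>  E[X]/4 = theta
mean_scale = sum(random.gammavariate(4, theta) for _ in range(n_draws)) / n_draws
# Shape-rate: X ~ Gamma(4, rate=theta)    =>  E[X]/4 = 1/theta
mean_rate = sum(random.gammavariate(4, 1 / theta) for _ in range(n_draws)) / n_draws

print(mean_scale / 4)  # close to theta = 2.0 (unbiased)
print(mean_rate / 4)   # close to 1/theta = 0.5 (biased unless theta = 1)
```

Under the shape–scale form $T$ is unbiased for $\theta$; under the shape–rate form it estimates $1/\theta$ instead, which matches the $\frac{1}{\theta}-\theta$ bias you computed.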