$Y_1, Y_2,...,Y_n$ is a random sample drawn from Gamma($\alpha$, 2)
(a) Find an unbiased estimator of 7 - 2$\alpha$. Verify your answer. Then find the mean square error of the estimator. Your estimator should depend on all data points.
(b) Find an unbiased estimator of $\alpha^2$. Your estimator should depend on all data points. Verify your answer.
(c) Find an unbiased estimator of $\alpha^3$. Your estimator should depend on all data points. Verify your answer.
What I have tried so far:
First I calculated $E(Y_i) = \alpha\beta = 2\alpha$, $V(Y_i) = \alpha\beta^2 = 4\alpha$
Also following from that, $E(\overline{Y}) = 2\alpha$, and $V(\overline{Y}) = \frac{4\alpha}{n}$
(a) Since the estimator should depend on all data points, I used $\overline{Y}$. I used $E(7 - \overline{Y}) = 7 - E(\overline{Y}) = 7 - 2\alpha$
Since this estimator is unbiased, the MSE should be equal to just variance. So I got MSE = $V(7 - \overline{Y}) = V(\overline{Y}) = \frac{4\alpha}{n}$
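A quick Monte Carlo sanity check of part (a) is possible (not a substitute for the algebraic verification). Python's `random.gammavariate(shape, scale)` uses the shape/scale parameterization, so `gammavariate(alpha, 2.0)` draws from the distribution in this problem; the values of $\alpha$, $n$, and the replication count below are arbitrary illustrative choices.

```python
# Monte Carlo sanity check for part (a) -- not a substitute for the algebra.
# random.gammavariate(shape, scale) matches Gamma(alpha, beta = 2) here.
# alpha = 3, n = 50, and reps = 20000 are arbitrary illustrative choices.
import random

random.seed(0)
alpha, n, reps = 3.0, 50, 20_000

estimates = []
for _ in range(reps):
    sample = [random.gammavariate(alpha, 2.0) for _ in range(n)]
    ybar = sum(sample) / n
    estimates.append(7 - ybar)  # the estimator 7 - Ybar

# Empirical mean should be near 7 - 2*alpha = 1, and the empirical MSE
# should be near 4*alpha/n = 0.24.
mean_est = sum(estimates) / reps
mse = sum((e - (7 - 2 * alpha)) ** 2 for e in estimates) / reps
print(mean_est, mse)
```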
(b) For this, I started with $E(\overline{Y}^2)$, for which I tried to derive a formula, but I'm not sure I did this part correctly. I did this:
$$E(\overline{Y}^2) = E\left[\left(\frac{1}{n}\sum_{i=1}^nY_i\right)^2\right] = \frac{1}{n^2}E\left[\left(\sum_{i=1}^nY_i\right)^2\right] = \frac{1}{n^2}\left[\sum_{i=1}^nE(Y_i^2) + \sum\sum_{i \ne j}E(Y_iY_j)\right]$$
From here, I used the identity $E(X^2) = V(X) + E(X)^2$, independence, and summation manipulation to arrive at:
$$\frac{1}{n^2}[n(4\alpha + 4\alpha^2) + (n^2 -n)4\alpha^2]$$
After some simplification, I got all of this equal to $4\alpha^2 + \frac{4\alpha}{n}$
Since this is biased for $\alpha^2$, I adjusted my estimator so that $E\left(\frac{\overline{Y}^2}{4} - \frac{\overline{Y}}{2n}\right) = \alpha^2$, which makes it unbiased.
I'm not sure this part is correct, but it seemed to make sense.
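One way to gain confidence in part (b) is a simulation: the average of $\frac{\overline{Y}^2}{4} - \frac{\overline{Y}}{2n}$ over many replications should settle near $\alpha^2$. As before, `random.gammavariate` uses the shape/scale parameterization, and the choices of $\alpha$, $n$, and the replication count are arbitrary.

```python
# Monte Carlo sanity check for part (b): the corrected estimator
# Ybar^2/4 - Ybar/(2n) should average out to alpha^2.
# alpha = 3, n = 50, and reps = 20000 are arbitrary illustrative choices.
import random

random.seed(1)
alpha, n, reps = 3.0, 50, 20_000

total = 0.0
for _ in range(reps):
    sample = [random.gammavariate(alpha, 2.0) for _ in range(n)]
    ybar = sum(sample) / n
    total += ybar ** 2 / 4 - ybar / (2 * n)

avg = total / reps  # should be close to alpha**2 = 9
print(avg)
```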
(c) This is where I ran into problems. I'm not sure how to approach this one. I tried looking in my text for some definitions and examples to help me, but I have no idea how to start. Any help would be greatly appreciated.
Observe that $Y_1\stackrel{d}{=}2W$ where $W\sim \text{Gamma}(\alpha, 1)$, i.e. $W$ has the density
$$ f(x)=\frac{1}{\Gamma(\alpha)}x^{\alpha-1}e^{-x}\quad (x>0). $$
From the density and the identity $\Gamma(\alpha+1)=\alpha\Gamma(\alpha)$, we note that $E(W^d)=\frac{\Gamma(\alpha+d)}{\Gamma(\alpha)}$. Then
$$ E(Y_1^2)=4E(W^2)=4\alpha(\alpha+1)=4\alpha^2+E(2\bar{Y})\implies E\left(\frac{Y_1^2-2\bar{Y}}{4}\right)=\alpha^2. $$
Let $T=\frac{Y_1^2-2\bar{Y}}{4}$ be the previous estimator of $\alpha^2$. Similarly,
$$ E(Y_1^3)=8E(W^3)=8\alpha(\alpha+1)(\alpha+2)=8\alpha^3+24\alpha^2+16\alpha=8\alpha^3+E(24T)+E(8\bar{Y}). $$
Thus
$$ E\left( \frac{Y_1^3-24T-8\bar{Y}}{8} \right)=\alpha^3. $$
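The two estimators above can also be checked by simulation: averaging $T$ and $\frac{Y_1^3-24T-8\bar{Y}}{8}$ over many replications should recover $\alpha^2$ and $\alpha^3$. The parameter values and replication count below are arbitrary choices for illustration.

```python
# Monte Carlo sanity check of the answer's estimators:
# T = (Y1^2 - 2*Ybar)/4 for alpha^2, and (Y1^3 - 24*T - 8*Ybar)/8 for alpha^3.
# alpha = 2, n = 20, and reps = 200000 are arbitrary illustrative choices.
import random

random.seed(2)
alpha, n, reps = 2.0, 20, 200_000

sum_T = sum_cube = 0.0
for _ in range(reps):
    sample = [random.gammavariate(alpha, 2.0) for _ in range(n)]
    ybar = sum(sample) / n
    y1 = sample[0]
    T = (y1 ** 2 - 2 * ybar) / 4
    sum_T += T
    sum_cube += (y1 ** 3 - 24 * T - 8 * ybar) / 8

avg_T = sum_T / reps        # should be close to alpha**2 = 4
avg_cube = sum_cube / reps  # should be close to alpha**3 = 8
print(avg_T, avg_cube)
```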