If you have a random sample of size $n$ from an exponential distribution with mean $\mu$, find a constant $c$ that minimizes the mean squared error for an estimator of $\mu$ of the form $c\sum X_i$.
So this is what I gather from my textbook so far:
We know that to minimize MSE, we want to minimize this equation: $$MSE = Var(T) + [b(T)]^2$$ where $T$ is an estimator of $\tau(\theta)$
Also, if $T$ is an estimator of $\tau(\theta)$, then the bias is given by $$b(T) = E(T) - \tau(\theta)$$
How would I use these resources to solve the problem?
When $X_1, \cdots, X_n$ are i.i.d. and $X_1 \sim \mathrm{Exp}(\mu)$, we have$$ E_\mu\left(\sum_{k = 1}^n X_k\right) = n\mu, \quad \operatorname{Var}_\mu\left(\sum_{k = 1}^n X_k\right) = n\mu^2. $$ Therefore for any $c \in \mathbb{R}$,\begin{align*} &\mathrel{\phantom{=}} \operatorname{Var}_\mu\left(c\sum_{k = 1}^n X_k\right) + \left(E_\mu\left(c\sum_{k = 1}^n X_k\right) - \mu\right)^2\\ &= c^2 n\mu^2 + (cn\mu - \mu)^2 = (n(n + 1) c^2 - 2nc + 1) \mu^2. \end{align*} This is a quadratic in $c$ with positive leading coefficient, so the MSE reaches its minimum at the vertex, $$c = \frac{2n}{2n(n + 1)} = \frac{1}{n + 1}.$$
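As a sanity check, a quick Monte Carlo simulation (my own sketch, not from the post; the sample size $n = 5$, mean $\mu = 2$, and trial count are arbitrary choices) can compare the empirical MSE of $c\sum X_i$ at $c = 1/(n+1)$ against the unbiased choice $c = 1/n$:

```python
import numpy as np

# Hypothetical simulation parameters (not from the original post).
rng = np.random.default_rng(0)
mu, n, trials = 2.0, 5, 200_000

# One row per trial: the sum of n i.i.d. Exp(mu) draws.
S = rng.exponential(scale=mu, size=(trials, n)).sum(axis=1)

def mse(c):
    """Empirical mean squared error of the estimator c * sum(X_i)."""
    return np.mean((c * S - mu) ** 2)

c_star = 1 / (n + 1)  # the claimed optimum
print(mse(c_star))    # theory predicts mu^2 / (n + 1) ~ 0.667
print(mse(1 / n))     # theory predicts mu^2 / n ~ 0.8 (larger)
```

Plugging $c = 1/(n+1)$ into the quadratic above gives a minimal MSE of $\mu^2/(n+1)$, which is strictly smaller than the MSE $\mu^2/n$ of the unbiased estimator $\bar{X}$; the simulation should reproduce both values up to sampling noise.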