Let $ (X_n)_{n \in \mathbb{N}} $ be a sequence of iid $ \exp(\lambda) $-distributed random variables with $ \lambda > 0 $.
a) We know that $ E[X_i] = \frac{1}{\lambda} $. Show that: $ E[X_i^2] = \frac{2}{\lambda^2} $.
b) Using the result of a), compute $ a_n > 0 $ such that the estimator $ T(X_1,...,X_n) := a_n \big( \sum_{i=1}^{n} X_i \big)^2 $ is unbiased for $ \tau(\lambda) = \frac{1}{\lambda^2} $.
My idea:
to a): I use the density of the exponential distribution; integrating by parts twice, we get for the second moment
$$ \begin{aligned} E[X_i^2] &= \int_{-\infty}^\infty x^2 \lambda e^{-\lambda x} 1_{\{ x \geq 0 \}} \, dx = \lambda \int_0^\infty x^2 e^{-\lambda x} \, dx \\ &= \lambda \big( -\lambda^{-1} e^{-\lambda x} x^2 \big) \Big\vert_0^\infty + \lambda \int_0^\infty 2x \lambda^{-1} e^{-\lambda x} \, dx = 2 \int_0^\infty x e^{-\lambda x} \, dx \\ &= 2 \big( -\lambda^{-1} e^{-\lambda x} x \big) \Big\vert_0^\infty + 2 \int_0^\infty \lambda^{-1} e^{-\lambda x} \, dx = \frac{2}{\lambda} \big( -\lambda^{-1} e^{-\lambda x} \big) \Big\vert_0^\infty = \frac{2}{\lambda^2} \end{aligned} $$
$ \Rightarrow E[ X_i^2 ] = \frac{2}{\lambda^2} $
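As a sanity check (not part of the exercise), the same integral can be evaluated symbolically with sympy:

```python
import sympy as sp

# lambda > 0, x is the integration variable
lam, x = sp.symbols("lambda x", positive=True)

# Second moment of Exp(lambda): integrate x^2 * density over [0, oo)
second_moment = sp.integrate(x**2 * lam * sp.exp(-lam * x), (x, 0, sp.oo))
print(second_moment)  # equals 2/lambda**2
```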
to b): I know that $ E\big[ a_n \big( \sum_{i=1}^n X_i \big)^2 \big] \overset{!}= \lambda^{-2} $ must hold. Then
$ E\big[a_n \big( \sum_{i=1}^n X_i \big)^2\big] = a_n E\big[\big( \sum_{i=1}^n X_i \big)^2\big] \overset{*}= a_n \Big(Var\big( \sum_{i=1}^n X_i \big) + \big(E\big[\sum_{i=1}^n X_i\big]\big)^2 \Big) = a_n\Big(\frac{n}{\lambda^2} + \frac{n^2}{\lambda^2}\Big) = a_n \frac{n(n+1)}{\lambda^2} $
$ \Rightarrow a_n = \frac{1}{n(n+1)} $.
Thus, the estimator is unbiased for $ a_n = \frac{1}{n(n+1)} $.
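A quick Monte Carlo check of the unbiasedness claim (the values of $\lambda$, $n$, and the number of replications are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 5, 200_000  # illustrative parameter choices

# reps independent samples of (X_1, ..., X_n), X_i ~ Exp(lam)
samples = rng.exponential(scale=1.0 / lam, size=(reps, n))

# Estimator T = a_n * (sum X_i)^2 with a_n = 1 / (n(n+1))
a_n = 1.0 / (n * (n + 1))
estimates = a_n * samples.sum(axis=1) ** 2

print(estimates.mean())  # should be close to 1/lam^2 = 0.25
```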
No, it is not correct.
You should say $E((\sum X_i)^2)=V(\sum X_i) + E^2(\sum X_i)$
and then you have to find $V(\sum X_i)=\sum V(X_i)= \sum [E(X_i^2)-E^2(X_i)]$
and $E^2(\sum X_i)=\big(\sum E(X_i)\big)^2$, using the independence of the $X_i$ for the variance step.
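The decomposition above can also be verified symbolically: the sum of $n$ iid $\exp(\lambda)$ variables is $\Gamma(n,\lambda)$-distributed, so $E\big[(\sum X_i)^2\big]$ can be computed directly from the Gamma density (a verification sketch, not part of the answer):

```python
import sympy as sp

lam, x = sp.symbols("lambda x", positive=True)
n = sp.symbols("n", positive=True, integer=True)

# Density of Gamma(n, lambda), the distribution of sum_{i=1}^n X_i
gamma_pdf = lam**n * x**(n - 1) * sp.exp(-lam * x) / sp.gamma(n)

# E[(sum X_i)^2]; should match Var + (E)^2 = n/lam^2 + n^2/lam^2
ES2 = sp.integrate(x**2 * gamma_pdf, (x, 0, sp.oo))
print(sp.gammasimp(ES2))  # equals n*(n + 1)/lambda**2
```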