Suppose we have $X_1, \ldots, X_n$, an i.i.d. sample of $N(0, \theta)$ random variables with $\theta \in (0, \infty)$. I am asked to determine the following:
a) A sufficient statistic $T$ for $\theta$.
b) $E(T)$.
c) An unbiased estimator of $\theta$ that is a function of $T$.
$\textbf{My Solution:}$
Suppose $s = (x_1, x_2, \ldots, x_n)$ is the observed sample.
For part (a) I got $T = (T_1(s), T_2(s)) = \left(\sum_{i=1}^{n}x_i^2, \sum_{i=1}^{n}x_i\right)$, but I'm not sure that is correct. Can someone confirm?
Assuming it is correct, how do I start part (b)? I know how to proceed when $T$ is one-dimensional (e.g. if $T = \sum_{i=1}^{n}x_i^2$ only), but $T$ is two-dimensional here and I don't know how to handle that. Please help.
The given density is not the general two-parameter Gaussian family; it is the particular Gaussian centered at zero, so the solution is simpler than you expect.
(a) A sufficient statistic is $T=\sum_{i=1}^{n}X_i^2$
In fact the likelihood is
$$L(\theta)=\Bigg(\frac{1}{2\pi\theta}\Bigg)^{\frac{n}{2}}e^{-\frac{1}{2\theta}\sum_i X_i^2},$$
which depends on the data only through $\sum_i X_i^2$. By the Fisher–Neyman factorization theorem (take $g(T,\theta)=L(\theta)$ and $h(x)=1$), the one-dimensional $T$ is sufficient on its own; the second component $\sum_i X_i$ in your answer is unnecessary because the mean is known to be zero.
(b)
$$\mathbb{E}[T]=\mathbb{E}\Bigg[\sum_{i=1}^{n}X_i^2\Bigg]=n\mathbb{E}[X_1^2]=n\theta,$$
since $\mathbb{E}[X_1^2]=\mathrm{Var}(X_1)+\big(\mathbb{E}[X_1]\big)^2=\theta+0=\theta$.
(c)
$$T^*=\frac{T}{n},$$
which is unbiased because $\mathbb{E}[T^*]=\frac{1}{n}\mathbb{E}[T]=\frac{n\theta}{n}=\theta$.
...and you are done!
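As a quick sanity check (not part of the formal answer), you can simulate the estimator $T^*=\frac{1}{n}\sum_i X_i^2$ and watch its average converge to $\theta$. The values of `theta`, `n`, and `reps` below are arbitrary choices for illustration:

```python
import numpy as np

# Simulate many samples of size n from N(0, theta) (variance theta)
# and check that T/n = mean of the X_i^2 averages out to theta.
rng = np.random.default_rng(0)
theta = 2.5        # assumed true variance (hypothetical choice)
n = 50             # sample size
reps = 100_000     # number of simulated samples

# Each row is one sample; T*/row is the row-wise mean of squares.
samples = rng.normal(loc=0.0, scale=np.sqrt(theta), size=(reps, n))
estimates = (samples ** 2).mean(axis=1)

print(estimates.mean())  # should be close to theta = 2.5
```

The average of the simulated estimates lands near $2.5$, consistent with $\mathbb{E}[T^*]=\theta$.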