Is the sample mean the optimal estimator of the mean of a random variable with a chi-square distribution?


Assume there is a random variable $Y\sim \chi_k^2$, and $N$ independent realizations $y_i,i=1,...,N$, of $Y$.

Is the sample mean $\hat{k}=\frac{1}{N}\sum_{i=1}^{N}y_i$ the optimal estimator of $k$ (the mean of $Y$)?

I tried looking for the answer in multiple textbooks, but the only example of the sample-mean estimator I found was for the Gaussian distribution. I know that the sample mean is the optimal estimator of the mean of a Gaussian random variable.
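As a quick sanity check of the setup, a minimal simulation (with hypothetical values $k=4$, $N=10{,}000$, and NumPy as the assumed library) draws the $y_i$ and computes the sample mean, which should land near $k$ since $\mathbb{E}[Y]=k$ for $Y\sim\chi^2_k$:

```python
import numpy as np

# Hypothetical setup: draw N independent chi-square(k) realizations
# and estimate k by the sample mean (E[Y] = k for Y ~ chi^2_k).
rng = np.random.default_rng(0)
k, N = 4, 10_000
y = rng.chisquare(df=k, size=N)
k_hat = y.mean()
print(k_hat)  # close to k = 4
```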


BEST ANSWER

The joint density of $Y_1,\ldots,Y_n$, where the $Y_i$ are i.i.d. with a $\chi^2_{\theta}$ distribution, is

$$f_{\theta}(y_1,\ldots,y_n)\propto \exp\left(-\frac12\sum_{i=1}^n y_i\right)\left(\prod_{i=1}^n y_i\right)^{\theta/2-1}\mathbf 1_{y_1,\ldots,y_n>0}\quad,\,\theta\in\{1,2,\ldots\}$$

By the factorization theorem, a sufficient statistic for $\theta$ is $T(Y_1,\ldots,Y_n)=\prod\limits_{i=1}^n Y_i$.

So, in terms of condensing the data without losing information about the unknown parameter, an estimator (optimal or not) of $\theta$ should logically be a function of $T$.
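In practice $T=\prod_i Y_i$ over- or underflows for even moderate $n$, so one would carry the equivalent statistic $\sum_i \log Y_i$ (a one-to-one function of $T$) instead. A short sketch, assuming NumPy and arbitrary illustrative values for the degrees of freedom and sample size:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.chisquare(df=4, size=1_000)

# prod(y) over/underflows for large n; log T = sum(log y_i) carries
# the same information, since log is a one-to-one transformation of T.
log_T = np.log(y).sum()
print(log_T)
```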

On the other hand, the method-of-moments estimator of $\theta$ is indeed the sample mean $\overline Y=\frac1n\sum\limits_{i=1}^n Y_i$. As is often the case with such estimators, $\overline Y$ converges in probability to $\theta$ by the law of large numbers, so it is consistent; it is also unbiased, with variance $2\theta/n$ (since $\operatorname{Var}(Y_i)=2\theta$).
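The consistency of $\overline Y$ is easy to see numerically. A minimal sketch (assuming NumPy, with an arbitrary choice of $\theta=5$): the sample mean tightens around $\theta$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 5
# Sample mean for growing n: the method-of-moments estimate
# concentrates around theta as n increases.
for n in (100, 10_000, 1_000_000):
    print(n, rng.chisquare(df=theta, size=n).mean())
```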

However, it is difficult to say which estimator is 'optimal' or 'best' without further context. 'Optimal' could mean the estimator with minimum variance or minimum mean squared error, or something like the maximum likelihood estimator. I don't recall off the top of my head what these estimators are for this problem.
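Since $\theta$ is integer-valued here, the maximum likelihood estimator can at least be computed numerically by a grid search over candidate degrees of freedom and compared with the sample mean. A sketch, assuming SciPy is available and using arbitrary illustrative values ($\theta=7$, $n=500$):

```python
import numpy as np
from scipy.stats import chi2  # assuming SciPy is available

rng = np.random.default_rng(2)
theta_true, n = 7, 500
y = rng.chisquare(df=theta_true, size=n)

# theta takes values in {1, 2, ...}, so the MLE can be found by
# maximizing the log-likelihood over a grid of candidate values.
candidates = np.arange(1, 31)
loglik = np.array([chi2.logpdf(y, df=d).sum() for d in candidates])
theta_mle = int(candidates[loglik.argmax()])
theta_mom = y.mean()  # method-of-moments (sample-mean) estimate
print(theta_mle, theta_mom)
```

Both estimates should land near the true value; comparing their sampling variability over repeated draws is one concrete way to make 'optimal' precise.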