Find an unbiased estimator for the parameter $\sigma^2$.


TASK: Let $(X_1, X_2)$ be an i.i.d. random sample from the $N(0, \sigma^2)$ distribution. Find an unbiased estimator for the parameter $\sigma^2$.

SOLUTION: An estimator $\hat\theta$ is unbiased for $\theta$ if $E\hat\theta = \theta$. In this case $\theta = \sigma^2$, and the proposed estimator is $\hat\sigma^2 = \frac{1}{2}(X_1^2 + X_2^2)$.

So $E\hat\sigma^2 = E\left[\frac{1}{2}(X_1^2+X_2^2)\right] = \frac{1}{2}(EX_1^2 + EX_2^2)$.

$EX_i^2 = VarX_i + (EX_i)^2 = VarX_i + 0 = \sigma^2$

And $E\hat\sigma^2 = \sigma^2$, so this estimator is unbiased.

QUESTION: That's it? I don't understand how we choose the estimator $\hat\sigma^2$ in the first place. Thank you for any help.

The idea is that you should recognize the formula for the sample variance, $s^2$, and guess that the sample variance is an unbiased estimator of the population variance, $\sigma^2$. (Because the population mean is known to be $0$, the sample variance here takes the simple form $\frac{1}{2}(X_1^2 + X_2^2)$, with no sample mean to subtract.)

First thing to note: it's more appropriate to use different letters for the estimator and the quantity it estimates. Here I'm using $s$ for the estimator and $\sigma$ for the population parameter we are trying to estimate.

I'm hypothesizing that the sample variance $s^2 = \frac{1}{2}(X_1^2 + X_2^2)$ is an unbiased estimator of the population variance $\sigma^2$. A few facts we will use:

  1. We know that $Var(X_1) = Var(X_2) = \sigma^2$, since they are from the same population.

  2. We know that the variance formula is $Var(X_1) = E(X_1^2) - (E(X_1))^2 = E(X_1^2)$ since $E(X_1) = 0$. It's similar to show that $Var(X_2) = E(X_2^2)$.
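As a quick numerical sanity check of fact 2 (my own illustration, not part of the original answer; the value of $\sigma$ below is an arbitrary choice for the demo), the identity $Var(X) = E(X^2) - (E(X))^2$ can be verified with simulated mean-zero normal draws:

```python
import random

# Check Var(X) = E(X^2) - (E X)^2 for draws from N(0, sigma^2).
# sigma and the sample size are arbitrary demo values.
random.seed(0)
sigma = 2.0
n = 200_000
xs = [random.gauss(0.0, sigma) for _ in range(n)]

mean = sum(xs) / n                    # approximates E(X), close to 0
mean_sq = sum(x * x for x in xs) / n  # approximates E(X^2)
var = mean_sq - mean ** 2             # Var(X) via the identity

print(var)  # should be near sigma^2 = 4
```

With the mean so close to zero, $E(X^2)$ alone already lands near $\sigma^2$, which is exactly what fact 2 asserts.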

Then, using these facts, I prove unbiasedness.

$$ \begin{align*} E(s^2) &= E\left(\frac{1}{2}(X_1^2 + X_2^2)\right) \\ &= \frac{1}{2}(E(X_1^2) + E(X_2^2)) \\ &= \frac{1}{2}(Var(X_1) + Var(X_2)) \\ &= \frac{1}{2}(\sigma^2 + \sigma^2) \\ &= \frac{1}{2}(2 \sigma^2) \\ &= \sigma^2 \end{align*} $$

Since we showed the expected value of our estimator, $E(s^2)$, is equal to the population parameter we are trying to estimate, $\sigma^2$, we have proved unbiasedness.
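To complement the algebraic proof, here is a small Monte Carlo sketch (my own addition; $\sigma$ and the trial count are arbitrary choices): repeatedly draw samples of size two from $N(0, \sigma^2)$, compute $s^2 = \frac{1}{2}(X_1^2 + X_2^2)$ each time, and average. Unbiasedness predicts the average hovers near $\sigma^2$.

```python
import random

# Monte Carlo check that s^2 = (X1^2 + X2^2)/2 is unbiased for sigma^2.
random.seed(1)
sigma = 3.0
trials = 100_000

total = 0.0
for _ in range(trials):
    x1 = random.gauss(0.0, sigma)  # one sample of size two from N(0, sigma^2)
    x2 = random.gauss(0.0, sigma)
    total += 0.5 * (x1 * x1 + x2 * x2)  # the estimator s^2 for this sample

estimate = total / trials  # empirical approximation of E(s^2)
print(estimate)  # should be near sigma^2 = 9
```

Note that this only checks the expectation; individual values of $s^2$ from a sample of size two still scatter widely around $\sigma^2$.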