unbiased estimator for variance


Consider a stochastic least squares problem

$Y = \Psi \theta + \epsilon$,

where $Y \in \mathbb{R}^N$, $\Psi \in \mathbb{R}^{N \times n}$, $\theta \in \mathbb{R}^n$, and $\epsilon \in \mathbb{R}^N$.

$\epsilon$ is zero-mean noise with covariance $\sigma^2 I \in \mathbb{R}^{N \times N}$.

Let $\theta^* = (\Psi' \Psi)^{-1} \Psi' Y$ be the least squares estimator (assuming $\Psi$ has full column rank, so that $\Psi' \Psi$ is invertible).

Show that $\hat{\sigma}^2 = \frac{1}{N - n} \left (Y - \Psi \theta^* \right )'(Y - \Psi \theta^*)$ is an unbiased estimator for $\sigma^2$.
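Before proving this, one can sanity-check the claim numerically. The following Monte Carlo sketch (with arbitrary illustrative dimensions and noise level chosen by me, not taken from the problem) draws many noise realizations, computes $\hat{\sigma}^2$ for each, and averages:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 50, 3          # arbitrary illustrative dimensions
sigma2 = 2.0          # true noise variance, chosen for the experiment
Psi = rng.standard_normal((N, n))
theta = rng.standard_normal(n)

trials = 20000
est = np.empty(trials)
for t in range(trials):
    eps = np.sqrt(sigma2) * rng.standard_normal(N)
    Y = Psi @ theta + eps
    # least squares estimator theta* = (Psi' Psi)^{-1} Psi' Y
    theta_star = np.linalg.solve(Psi.T @ Psi, Psi.T @ Y)
    r = Y - Psi @ theta_star
    est[t] = (r @ r) / (N - n)

print(np.mean(est))   # close to sigma2 = 2.0
```

The empirical mean of $\hat{\sigma}^2$ lands near the true $\sigma^2$, consistent with unbiasedness; dividing by $N$ instead of $N - n$ would visibly undershoot.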

My solution

I have to show that

$\mathbb{E} \left [ \frac{1}{N - n} \left (Y - \Psi \theta^* \right )'(Y - \Psi \theta^*) \right ] - \sigma^2 = 0$.

After some working, I get to:

$\mathbb{E} \left [ \frac{1}{N - n} \left (Y - \Psi \theta^* \right )'(Y - \Psi \theta^*) \right ] - \sigma^2 = \frac{1}{N - n} \left ( n \sigma^2 - \mathbb{E}[\epsilon' \Psi (\Psi' \Psi)^{-1} \Psi' \epsilon] \right ) $.

I just need to show that $\mathbb{E}[\epsilon' \Psi (\Psi' \Psi)^{-1} \Psi' \epsilon] = n \sigma^2$.

Can anyone help me?

Thanks

Best answer

Set $M := \Psi (\Psi' \Psi)^{-1} \Psi'$ and let $M_{ij}$ denote the entry of $M$ in the $i$-th row and $j$-th column.

$E(\epsilon'M\epsilon) =\sum_{i,j = 1}^{N} E(\epsilon_i M_{ij} \epsilon_j) =\sum_{i,j = 1}^{N} E(\epsilon_i\epsilon_j M_{ij}) = \sum_{i,j = 1}^{N} E(\epsilon_i\epsilon_j)M_{ij} = \sum_{i=1}^{N} E(\epsilon_i\epsilon_i)M_{ii}$

since $E(\epsilon_i\epsilon_j) = 0$ for $i \neq j$ (the noise components are uncorrelated).

$\sum_{i=1}^{N} E(\epsilon_i\epsilon_i)M_{ii} = \sum_{i=1}^{N} \sigma^2 M_{ii} = \sigma^2 \operatorname{tr}(M). \quad (i)$

By the cyclic property of the trace, $\operatorname{tr}(M) = \operatorname{tr}\left(\Psi (\Psi' \Psi)^{-1} \Psi'\right) = \operatorname{tr}\left(\Psi' \Psi (\Psi' \Psi)^{-1}\right) = \operatorname{tr}(I_n) = n. \quad (ii)$

(i), (ii) $\Rightarrow E(\epsilon'M\epsilon) = n\sigma^2$.
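Both facts, $\operatorname{tr}(M) = n$ and $E(\epsilon' M \epsilon) = n\sigma^2$, can be checked numerically. This sketch (with sizes and $\sigma^2$ chosen arbitrarily for illustration) builds $M$, prints its trace, and Monte Carlo averages the quadratic form:

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 40, 4          # arbitrary illustrative sizes
sigma2 = 1.5
Psi = rng.standard_normal((N, n))
M = Psi @ np.linalg.inv(Psi.T @ Psi) @ Psi.T   # M = Psi (Psi'Psi)^{-1} Psi'

# (ii): tr(M) = n, up to floating point error
print(np.trace(M))    # ~ 4.0

# (i): the Monte Carlo average of eps' M eps approaches n * sigma2
trials = 20000
total = 0.0
for _ in range(trials):
    eps = np.sqrt(sigma2) * rng.standard_normal(N)
    total += eps @ M @ eps
print(total / trials) # ~ n * sigma2 = 6.0
```

The trace equals $n$ exactly (up to floating point), since $M$ is the orthogonal projection onto the column space of $\Psi$, and the averaged quadratic form settles at $n\sigma^2$ as the proof predicts.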