Linear model $Y= X\beta +\epsilon$. Show that the two subcomponents $\hat{\beta_1}$ and $\hat{\beta_2}$ of the BLUE $\hat{\beta}$ are independent.


Suppose a linear model $Y= X\beta +\epsilon$ where $\epsilon \sim N(0,\sigma ^2I)$. Write $X=[X_1\; X_2]$, where $X_1$ consists of the first $p_1$ columns of $X$ and $X_2$ of the last $p_2$ columns. Similarly split $\beta ^T =(\beta_1^T, \beta_2^T)$.

If $X_1'X_2 =0$ show that the two subcomponents $\hat{\beta_1}$ and $\hat{\beta_2}$ of the BLUE $\hat{\beta}$ are independent.

I know that $\hat{\beta_1}$ and $\hat{\beta_2}$ are normal because of $\epsilon$. I do not know how to start.

On BEST ANSWER

If $X=\pmatrix{X_1 & X_2}$ then $X'=\pmatrix{ X_1' \cr X_2'}$. Calculate $\hat\beta=(X'X)^{-1}X'Y$. Since $X_1'X_2=0$ (and hence $X_2'X_1=(X_1'X_2)'=0$), the off-diagonal blocks vanish: $$ X'X = \pmatrix{ X_1' \cr X_2'}\pmatrix{X_1 & X_2}=\pmatrix{X_1'X_1 & 0_{p_1,p_2} \cr 0_{p_2,p_1} & X_2'X_2}. $$ Here $0_{k,m}$ is the $(k\times m)$ zero matrix.

The inverse of this block-diagonal matrix is $$ (X'X)^{-1} =\pmatrix{(X_1'X_1)^{-1} & 0_{p_1,p_2} \cr 0_{p_2,p_1} & (X_2'X_2)^{-1}}, $$ and therefore $$\hat\beta=(X'X)^{-1}X'Y = \pmatrix{(X_1'X_1)^{-1}X_1'Y \cr (X_2'X_2)^{-1}X_2'Y}=\pmatrix{\hat\beta_1\cr \hat\beta_2}.$$
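A quick numerical sketch of this block structure (hypothetical dimensions and data, using NumPy): when the column blocks satisfy $X_1'X_2=0$, the full OLS fit coincides with the two separate regressions of $Y$ on $X_1$ and on $X_2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n = 50 observations, p1 = 2, p2 = 3.
n, p1, p2 = 50, 2, 3

# QR gives orthonormal columns, so X1'X2 = 0 holds (up to rounding).
Q, _ = np.linalg.qr(rng.standard_normal((n, p1 + p2)))
X1, X2 = Q[:, :p1], Q[:, p1:]
X = np.hstack([X1, X2])

beta = rng.standard_normal(p1 + p2)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Full OLS fit versus the two blockwise fits.
beta_full = np.linalg.solve(X.T @ X, X.T @ y)
beta1 = np.linalg.solve(X1.T @ X1, X1.T @ y)
beta2 = np.linalg.solve(X2.T @ X2, X2.T @ y)

# The stacked blockwise estimates reproduce the full estimate.
print(np.allclose(beta_full, np.concatenate([beta1, beta2])))
```

This mirrors the block-diagonal inverse above: $(X'X)^{-1}X'Y$ decouples into $(X_1'X_1)^{-1}X_1'Y$ and $(X_2'X_2)^{-1}X_2'Y$.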

Since $Y=X\beta+\epsilon$ and $X_1'X_2=0$, $$\hat\beta_1=(X_1'X_1)^{-1}X_1'Y=\beta_1+(X_1'X_1)^{-1}X_1'\epsilon$$ and likewise $$\hat\beta_2=\beta_2+(X_2'X_2)^{-1}X_2'\epsilon,$$ so each estimator is a nonrandom vector ($C_1=\beta_1$, $C_2=\beta_2$) plus a linear function of $\epsilon$.

The vector $\hat\beta$ is normally distributed, being a linear transformation of the Gaussian vector $\epsilon$. For jointly normal vectors, independence of the subvectors $\hat\beta_1$ and $\hat\beta_2$ follows once the cross-covariance matrix is shown to be zero (the nonrandom terms $C_1,C_2$ drop out of the covariance): $$ \mathop{Cov}(\hat\beta_1, \hat\beta_2)=\mathbb E\left[\left((X_1'X_1)^{-1}X_1'\epsilon\right)\cdot\left((X_2'X_2)^{-1}X_2'\epsilon\right)'\right] $$ $$= \mathbb E\left[(X_1'X_1)^{-1}X_1'\epsilon \epsilon' X_2 (X_2'X_2)^{-1}\right] = (X_1'X_1)^{-1}X_1'\underbrace{\mathop{\mathbb E}\left[\epsilon \epsilon'\right]}_{\sigma^2I}X_2 (X_2'X_2)^{-1} $$ $$ =\sigma^2(X_1'X_1)^{-1}\underbrace{X_1'X_2}_{0_{p_1,p_2}} (X_2'X_2)^{-1}=0_{p_1,p_2}. $$
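The zero cross-covariance can also be checked empirically. A minimal Monte Carlo sketch (hypothetical sizes and noise level, NumPy): fix an orthogonal-block design, draw fresh noise many times, and estimate the sample covariance between $\hat\beta_1$ and $\hat\beta_2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical design: n = 30, one column per block, orthonormal columns
# so that X1'X2 = 0 as in the derivation above.
n, p1, p2 = 30, 1, 1
Q, _ = np.linalg.qr(rng.standard_normal((n, p1 + p2)))
X = np.hstack([Q[:, :p1], Q[:, p1:]])
beta = np.array([1.0, -2.0])
sigma = 0.5

# Monte Carlo: refit OLS over many independent noise draws and record
# the two scalar estimates.
reps = 20_000
b1 = np.empty(reps)
b2 = np.empty(reps)
for r in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    b1[r], b2[r] = np.linalg.solve(X.T @ X, X.T @ y)

# Sample cross-covariance should be near zero (up to Monte Carlo error).
cross_cov = np.cov(b1, b2)[0, 1]
print(cross_cov)
```

Here each marginal variance is $\sigma^2(X_i'X_i)^{-1}=\sigma^2$, so the Monte Carlo error of the sample cross-covariance is on the order of $\sigma^2/\sqrt{\text{reps}}\approx 0.002$, and the printed value should be close to zero.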