Independence of parameters in linear regression


Consider the linear model $y = X\beta + \varepsilon$, where $X \in \mathbb R^{n \times p}$ has full column rank and $\varepsilon \sim\operatorname{N}(0,\sigma^2I_n)$.

Consider adding a covariate $ \mu \in \mathbb R^n$ to get the model, $y = X\beta + \gamma\mu + \varepsilon$, and assume $X^T \mu = 0, \gamma \in \mathbb R.$ Let ($\hat\beta_\text{new}, \hat\gamma$) be the LSE of parameters ($\beta, \gamma$) in this new model.

True or False: $\hat\beta_\text{new}$ and $\hat\gamma$ are independent.

I start by noting that since $(\hat\beta_\text{new}, \hat\gamma)$ is jointly multivariate normal, zero covariance implies independence. So I want to show:

$$\operatorname{cov}(\hat\beta_\text{new}, \hat\gamma)=0.$$

Equivalently, I want $\operatorname{E}(\hat\beta_\text{new}\hat\gamma) - \operatorname{E}(\hat\beta_\text{new})\operatorname{E}(\hat\gamma) = 0.$

Since the LSE is unbiased, $\operatorname{E}(\hat\beta_\text{new}) = \beta$ and $\operatorname{E}(\hat\gamma) = \gamma$, so it suffices to show:

$\operatorname{E}(\hat\beta_\text{new}\hat\gamma) = \beta\gamma$

Am I doing this correctly?
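Before doing the algebra, the claim can be checked by simulation. The sketch below (dimensions, seed, and parameter values are my own choices) builds an $X$ and a $\mu$ with $X^\top\mu = 0$, repeatedly simulates $y = X\beta + \gamma\mu + \varepsilon$, and estimates the covariance between each component of $\hat\beta_\text{new}$ and $\hat\gamma$ empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
mu = rng.standard_normal(n)
# project out the column space of X so that X.T @ mu = 0
mu -= X @ np.linalg.solve(X.T @ X, X.T @ mu)

beta, gamma, sigma = np.array([1.0, -2.0, 0.5]), 3.0, 1.0
Z = np.column_stack([X, mu])  # design matrix of the augmented model

draws = []
for _ in range(5000):
    y = X @ beta + gamma * mu + sigma * rng.standard_normal(n)
    est = np.linalg.lstsq(Z, y, rcond=None)[0]  # (beta_hat_new, gamma_hat)
    draws.append(est)
draws = np.array(draws)

# sample covariance between each component of beta_hat_new and gamma_hat
cov = np.cov(draws, rowvar=False)[:p, p]
print(cov)  # should all be near 0
```

The empirical covariances come out at roughly the Monte Carlo noise level, consistent with the claim being true.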


One consequence of the orthogonality of $\mu$ and the columns of $X$ is that $\widehat\beta_\text{new} = \widehat\beta,$ the LSE from the model without $\mu.$ Since $X^\top\mu = 0$ and $\mu^\top X = 0,$ the Gram matrix is block diagonal and inverts blockwise:
\begin{align} \left[ \begin{array}{c} \widehat\beta_\text{new} \\ \widehat\gamma \end{array} \right] & = \left( \left[ \begin{array}{cc} X & \mu \end{array} \right]^\top \left[ \begin{array}{cc} X & \mu \end{array} \right] \right)^{-1} \left[ \begin{array}{cc} X & \mu \end{array} \right]^\top y \\[10pt] & = \left[ \begin{array}{cc} X^\top X & 0 \\ 0 & \mu^\top\mu \end{array} \right]^{-1} \left[ \begin{array}{c} X^\top \\ \mu^\top \end{array} \right] y = \left[ \begin{array}{c} (X^\top X)^{-1}X^\top y \\ (\mu^\top \mu)^{-1} \mu^\top y \end{array} \right] = \left[ \begin{array}{c} \widehat\beta \\ \widehat\gamma \end{array} \right]. \end{align}
In the same way, the coefficient $\widehat\gamma$ is the same as it would have been if $X$ had not been in the model.

Now, using the general rule $\operatorname{cov}(Au,Bv) = A \operatorname{cov}(u,v) B^\top$ together with $\operatorname{cov}(y,y) = \sigma^2 I_n,$
\begin{align} \operatorname{cov}(\widehat\beta,\widehat\gamma) & = \operatorname{cov}\left( (X^\top X)^{-1} X^\top y,\ (\mu^\top \mu)^{-1} \mu^\top y \right) \\[10pt] & = (X^\top X)^{-1} X^\top \Big( \operatorname{cov}(y,y) \Big) \mu(\mu^\top\mu)^{-1} \\[10pt] & = \sigma^2 (X^\top X)^{-1} X^\top \mu\, (\mu^\top\mu)^{-1}, \end{align}
and the orthogonality $X^\top\mu = 0$ shows that this is $0.$ Since $(\widehat\beta_\text{new}, \widehat\gamma)$ is jointly Gaussian, zero covariance implies independence.
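The two identities used above are purely algebraic, so they can be verified on arbitrary data. A minimal sketch (variable names and dimensions are my own) checks that when $X^\top\mu = 0,$ the joint LSE returns $\widehat\beta$ from regressing $y$ on $X$ alone and $\widehat\gamma$ from regressing $y$ on $\mu$ alone:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 40, 4
X = rng.standard_normal((n, p))
mu = rng.standard_normal(n)
mu -= X @ np.linalg.solve(X.T @ X, X.T @ mu)  # enforce X.T @ mu = 0
y = rng.standard_normal(n)  # any response works: the identity is algebraic

Z = np.column_stack([X, mu])
joint = np.linalg.solve(Z.T @ Z, Z.T @ y)     # (beta_hat_new, gamma_hat)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # X-only fit
gamma_hat = (mu @ y) / (mu @ mu)              # mu-only fit

print(np.allclose(joint[:p], beta_hat), np.isclose(joint[p], gamma_hat))
```

Both comparisons print `True` (up to floating-point error from the orthogonalization step).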