P-value change for dropping orthogonal explanatory variable


Consider a linear model $Y = X\beta + \epsilon$, where $E(\epsilon) = 0$, $X$ is a fixed deterministic $n\times p$ design matrix with $\operatorname{rank}(X) = p$, and $\beta = (\beta_1, \ldots, \beta_p)^T$.
Explain whether the following claim is true or not:
If all explanatory variables are orthogonal (uncorrelated), dropping one explanatory variable will not change the remaining p-values.

The solution in our exercise says the claim is true. Can anyone give insights into why this might be true (if it is)?

Best answer:

Your solutions manual is wrong. For an orthogonal design, the coefficient estimates $\hat{\beta}_i$ will not change after dropping a variable. However, your estimate of $\sigma$, namely $\hat{\sigma}$, will change, as will the number of degrees of freedom of the $t$-distributions you use to test your coefficients. Both effects change the $p$-values you observe, as @daniel shows in his simulation.
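A minimal sketch of such a simulation (assuming NumPy and SciPy; the orthonormal design is built from a QR decomposition, so $X^TX = I$ and the coefficient estimates decouple):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 20, 3

# Orthonormal design: the Q factor of a QR decomposition has orthogonal columns.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))

beta = np.array([1.0, 0.5, 2.0])
y = X @ beta + rng.standard_normal(n)

def ols_t_test(X, y):
    """OLS estimates and two-sided t-test p-values for each coefficient."""
    n, p = X.shape
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - p)          # sigma-hat^2 uses n - p df
    cov = sigma2_hat * np.linalg.inv(X.T @ X)
    t = beta_hat / np.sqrt(np.diag(cov))
    return beta_hat, 2 * stats.t.sf(np.abs(t), df=n - p)

b_full, p_full = ols_t_test(X, y)
b_red, p_red = ols_t_test(X[:, :2], y)            # drop the third column

# Estimates for the remaining variables are unchanged (orthogonality) ...
print(np.allclose(b_full[:2], b_red))              # True
# ... but the p-values differ: sigma-hat and the df have both changed.
print(np.allclose(p_full[:2], p_red))              # False
```

Dropping the column with true coefficient $2.0$ pushes its contribution into the residuals, inflating $\hat{\sigma}$ in the reduced model, so the remaining $p$-values move even though the estimates do not.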