Orthogonal regressors in linear regression model


I am having trouble proving a fact that was left as an exercise in my statistics course:

In the linear regression model, if $X \in \mathbb{R}^{n \times p}$ is the regressor matrix and is orthogonal ($X^TX = I_{p}$), then $$\hat{\beta}_i = \operatorname{corr}(x_{.i},\, y)\cdot \|y - \overline{y}\cdot 1_n\|, \quad \text{for } i = 2, \dotsc, p,$$

where $\hat \beta_i$ is the $i$-th coordinate of the least squares estimator, $x_{.i}$ is the $i$-th column of $X$, $y$ is the dependent variable, and $1_n$ is the column vector of ones of length $n$.

My attempt was to center $y$ (write $y_C = y - \overline{y}\cdot 1_n$), which changes only the intercept $\hat \beta_1$; then $\hat \beta = (X^TX)^{-1}X^Ty_C = X^Ty_C$. Now $\operatorname{corr}(x_{.i},\, y)$ is the $\cos$ of the angle between the centered $x_{.i}$ and $y_C$, so $$\langle x_{.i} - \overline{x_{.i}}\cdot 1_n,\; y_C\rangle = \|x_{.i} - \overline{x_{.i}}\cdot 1_n\|\cdot\|y_C\|\cdot \operatorname{corr}(x_{.i},\, y).$$ Also $\langle x_{.i} - \overline{x_{.i}}\cdot 1_n,\; y_C\rangle = \langle x_{.i},\, y_C\rangle = (X^Ty_C)_i$, since $y_C$ is orthogonal to $1_n$.

So I arrive at essentially the equation I was asked to prove, except it carries the extra factor $\|x_{.i} - \overline{x_{.i}}\cdot 1_n\|$. That factor would equal $1$ if $\overline{x_{.i}}$ were $0$ (so that the unit-norm column $x_{.i}$ is already centered), but the lecture notes say nothing about this. I need someone with a fresh eye to look at my solution and spot the error.
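As a quick numerical sanity check (my addition, not part of the exercise), the identity derived in the attempt, $(X^Ty_C)_i = \|x_{.i} - \overline{x_{.i}}\cdot 1_n\|\cdot\|y_C\|\cdot \operatorname{corr}(x_{.i}, y)$, can be verified for a random orthonormal $X$ built via QR; note the extra norm factor is generically not $1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 4

# Generic orthonormal regressor matrix: QR of a random matrix
# gives X with X^T X = I_p, the exercise's orthogonality assumption.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
y = rng.standard_normal(n)

y_c = y - y.mean()                      # centered response y_C
for i in range(p):
    x = X[:, i]
    x_c = x - x.mean()                  # centered column x_{.i}
    corr = np.corrcoef(x, y)[0, 1]      # Pearson correlation corr(x_{.i}, y)
    lhs = x @ y_c                       # (X^T y_C)_i, since y_C is centered
    rhs = np.linalg.norm(x_c) * np.linalg.norm(y_c) * corr
    assert np.isclose(lhs, rhs)         # the attempt's identity holds exactly
    # The extra factor ||x_{.i} - mean * 1_n||, generically != 1:
    print(i, np.linalg.norm(x_c))
```

So the algebra in the attempt is internally consistent; the question is only where the missing norm factor should come from.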