Minimizing the sum of squared residuals


I have the equation $y=X\beta+u$, where $y \in \mathbb{R}^{n \times 1}$, $X \in \mathbb{R}^{n \times (k+1)}$, $\beta \in \mathbb{R}^{(k+1) \times 1}$, and $u$ is the error term, an $n \times 1$ vector.

$$-2(y-X\beta)X=0\Leftrightarrow X'(y-X\beta)=0$$

Can someone explain why the two sides of the statement above are equivalent?

Best answer

They are not equivalent.

Check the dimensions: the first factor $(y- X\beta) \in \mathbb{R}^{n\times 1}$, but $X \in \mathbb{R}^{n \times (k+1)}$, so the product $(y-X\beta)X$ is not even conformable (an $n\times 1$ matrix cannot left-multiply an $n \times (k+1)$ matrix).

I think you intended to ask about

$$-2(y-X\beta)'X=0 \iff X'(y-X\beta)=0 $$

This follows by taking transposes. Recall that $(AB)'=B'A'$, so
$$\left(-2(y-X\beta)'X\right)' = -2X'(y-X\beta).$$
A matrix equals zero if and only if its transpose does, and the nonzero factor $-2$ can be dropped, so the two conditions are equivalent.
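As a quick numerical sanity check (a sketch, assuming simulated data with the dimensions from the question), we can fit OLS via the normal equations and confirm that both forms of the first-order condition vanish at the solution:

```python
import numpy as np

# Sketch: verify numerically that X'(y - Xb) = 0 at the OLS solution.
# Dimensions follow the question: y is n x 1, X is n x (k+1).
rng = np.random.default_rng(0)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, k))])  # n x (k+1)
beta_true = rng.standard_normal(k + 1)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# OLS estimate from the normal equations: (X'X) b = X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

residual = y - X @ b
# Both forms of the first-order condition hold (up to floating point):
print(np.allclose(X.T @ residual, 0))       # X'(y - Xb) = 0
print(np.allclose(-2 * residual.T @ X, 0))  # -2(y - Xb)'X = 0
```

Note that `residual.T @ X` is the transpose of `X.T @ residual`, which is exactly the equivalence in the answer.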