Help Proving Linear Regression Model in Matrix Notation


I need to show that:

$\hat \mu ^{'} \hat \mu = \hat \mu ^{'} y$

Given the matrix notation of the linear regression model: $y= X\beta + \mu$

Also given: $\hat y = X \hat \beta $ and $\hat \mu = y - X\hat \beta $

I have tried:

$(y-X\hat\beta)^{'}(y-X\hat\beta) = (y-X\hat\beta)^{'}y$

I'm not sure if there's a property of transposes that I'm not aware of, but I don't see how we could conclude that $(y-X \hat \beta) = y$.

Are we assuming that $X\hat\beta = 0 $ ?

There is 1 answer below.


We know from the OLS normal equations (the first-order conditions of least squares) that $$X'(y-X\hat{\beta})=0\tag{1}$$

We want to show that $$\hat{\mu}'\hat{\mu}=\hat{\mu}'y$$

which is equivalent to

$$(y-X\hat{\beta})'X\hat{\beta}=0$$
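To see the equivalence, expand $\hat{\mu}'\hat{\mu}$ using $\hat{\mu}=y-X\hat{\beta}$:

$$\hat{\mu}'\hat{\mu}=\hat{\mu}'(y-X\hat{\beta})=\hat{\mu}'y-\hat{\mu}'X\hat{\beta}=\hat{\mu}'y-(y-X\hat{\beta})'X\hat{\beta},$$

so $\hat{\mu}'\hat{\mu}=\hat{\mu}'y$ holds exactly when $(y-X\hat{\beta})'X\hat{\beta}=0$.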

Let's take the transpose,

$$\hat{\beta}'\left[X'(y-X\hat{\beta})\right]=0$$

From $(1)$, we can see that the result is true.
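As a quick numerical sanity check (not part of the proof), here is a sketch with NumPy on synthetic data; the dataset and variable names are mine, not from the question:

```python
import numpy as np

# Synthetic regression data (hypothetical example)
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 regressors
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(size=n)

# OLS fit and residuals mu_hat = y - X beta_hat
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
mu_hat = y - X @ beta_hat

# The two sides of the identity mu_hat' mu_hat = mu_hat' y
lhs = mu_hat @ mu_hat
rhs = mu_hat @ y
print(np.isclose(lhs, rhs))  # True (up to floating-point error)

# And the orthogonality condition (1): X'(y - X beta_hat) = 0
print(np.allclose(X.T @ mu_hat, 0))  # True
```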

Regarding your attempt:

For matrices, if $AB=AC$, we cannot conclude that $B=C$ unless additional conditions hold, such as $A$ being non-singular.
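A small numerical illustration of this cancellation failure (my own toy matrices, chosen so that $A$ is singular):

```python
import numpy as np

# A is singular (rank 1), so left-cancellation fails
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
C = np.array([[1.0, 2.0],
              [9.0, 9.0]])  # differs from B only in the row A annihilates

print(np.array_equal(A @ B, A @ C))  # True: AB = AC
print(np.array_equal(B, C))          # False: yet B != C
```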