Error vs residual in simple linear regression


In my textbook, the following definition was presented:

LS method

However, the definition of the regression model was presented previously as:

regression definition

Thus, if we were to minimise the sum of squared residuals, shouldn't we be minimising $Q=\sum_{i=1}^n(y_i-\hat{y}_i)^2=\sum_{i=1}^n(y_i-\hat{\beta}_0-\hat{\beta}_1x_i)^2$?


You may understand it this way: $\beta_0$ and $\beta_1$ are the variables over which the objective $Q$ is minimised, while $\hat{\beta}_0$ and $\hat{\beta}_1$ denote the optimal solution of that minimisation problem. In other words, $Q$ is written as a function of generic coefficients $(\beta_0,\beta_1)$, and the hatted quantities are the particular values that minimise it.
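A minimal numerical sketch of this distinction (all names here are illustrative, not from the textbook): we simulate data with known true parameters, minimise $Q$ via the closed-form least-squares solution, and check that the fitted $(\hat{\beta}_0,\hat{\beta}_1)$ give a smaller $Q$ than the true $(\beta_0,\beta_1)$. It also shows the error vs residual contrast: the errors $\varepsilon_i$ are unobservable, while the residuals $y_i-\hat{y}_i$ are computed from the fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
beta0, beta1 = 2.0, 0.5                # true parameters (unknown in practice)
x = rng.uniform(0, 10, n)
eps = rng.normal(0, 1, n)              # errors: unobservable in real data
y = beta0 + beta1 * x + eps

# Closed-form minimisers of Q(b0, b1) = sum_i (y_i - b0 - b1*x_i)^2
b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_hat = y.mean() - b1_hat * x.mean()

residuals = y - (b0_hat + b1_hat * x)  # observable; not the same as eps

Q_hat = np.sum((y - b0_hat - b1_hat * x) ** 2)   # Q at the optimal solution
Q_true = np.sum((y - beta0 - beta1 * x) ** 2)    # Q at the true parameters

print(b0_hat, b1_hat)
print(residuals.sum())                 # ~0: OLS residuals sum to zero with an intercept
print(Q_hat <= Q_true)                 # the minimiser beats any other choice, truth included
```

Note that $Q(\hat{\beta}_0,\hat{\beta}_1)\le Q(\beta_0,\beta_1)$ always holds by definition of the minimiser, which is exactly the sense in which the hatted coefficients are "the optimal solution" of $Q$.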