Deriving estimators for the parameters a and b that minimize the random error - setting up linear regression variables?


I'm reviewing old notes; I know I solved this way back when, but I can't remember how:

Consider the simple linear regression model:

$$Y_i = a + bX_i + \epsilon_i$$

where $Y_i$ is the dependent variable, $X_i$ is the independent variable, and $a$ and $b$ are model parameters, and $\epsilon_i$ is the error term. Derive the estimators for parameters a and b when trying to minimize the sum of the squared error terms.

I think the first step is to rearrange the equation into $\epsilon_i = Y_i - a - bX_i$ and then differentiate with respect to $a$ and $b$. But I have no idea where to go from there.

Much thanks.

1 Answer

BEST ANSWER

Write down an expression for the sum of squared errors, using the same parameterization as the model above: $$e=\sum_i (Y_i - a - bX_i)^2$$

Then, set $\frac{\partial e}{\partial a} = 0$, $\frac{\partial e}{\partial b} = 0$ and solve the resulting system of equations for $a$ and $b$. Since the expression for $e$ is convex in $a$ and $b$, this minimizes the squared error (or you can use the second derivative test).
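Carrying this through (a sketch): differentiating $e=\sum_i (Y_i - a - bX_i)^2$ and setting each derivative to zero gives the so-called normal equations,

$$\frac{\partial e}{\partial a} = -2\sum_i (Y_i - a - bX_i) = 0, \qquad \frac{\partial e}{\partial b} = -2\sum_i X_i (Y_i - a - bX_i) = 0.$$

Solving this $2\times 2$ linear system (with $\bar X = \frac{1}{n}\sum_i X_i$ and $\bar Y = \frac{1}{n}\sum_i Y_i$) yields the familiar estimators:

$$\hat b = \frac{\sum_i (X_i - \bar X)(Y_i - \bar Y)}{\sum_i (X_i - \bar X)^2}, \qquad \hat a = \bar Y - \hat b \bar X.$$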

A smarter way to do this would be to use the orthogonality principle from linear algebra.
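As a quick numerical check, here is a minimal sketch (using NumPy and made-up toy data) that computes the closed-form estimators from the normal equations and confirms they agree with the orthogonality-principle route, i.e. solving $X^\top X \beta = X^\top y$ for a design matrix with a column of ones:

```python
import numpy as np

# Toy data, assumed for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# Closed-form OLS estimators from the normal equations
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

# Orthogonality principle: the residual y - X beta is orthogonal to the
# column space of X, which is exactly the system X^T X beta = X^T y
X = np.column_stack([np.ones_like(x), x])  # columns: intercept, slope
beta = np.linalg.solve(X.T @ X, X.T @ y)   # beta = [a_hat, b_hat]

print(a_hat, b_hat, beta)
```

Both routes give the same $(\hat a, \hat b)$, which is a useful sanity check when working through the algebra by hand.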