Partial Derivatives for Sum of Squared Errors


I have a simple, affine equation of the form $$ y = mx + b $$

I am using this equation to model some real-life data, and am accumulating the sum of squared errors, as per the formula:

$$ SSE(m, b) =\sum_{i=1}^n(y_i - (mx_i + b))^2 $$

I would like to optimize the parameters $m$ and $b$. Are the partial derivatives below correct? I would like to keep the summation explicit if possible.

$$ \frac{\partial}{\partial m} SSE = 2 \sum_{i=1}^n(y_i - (mx_i + b))$$ $$ \frac{\partial}{\partial b} SSE = 2 \sum_{i=1}^n(y_i - (mx_i + b))$$

BEST ANSWER

No. By the chain rule of differentiation, each term in the sum must first be multiplied by the relevant partial derivative of $(y_i - mx_i - b)$, and only then summed.
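Carrying out the chain rule: the inner derivative of $(y_i - mx_i - b)$ is $-x_i$ with respect to $m$ and $-1$ with respect to $b$, so the correct partials are

$$ \frac{\partial}{\partial m} SSE = -2 \sum_{i=1}^n x_i \left(y_i - (mx_i + b)\right) $$
$$ \frac{\partial}{\partial b} SSE = -2 \sum_{i=1}^n \left(y_i - (mx_i + b)\right) $$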

For the correct expressions, see the end of page 1 of this PDF.
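As a quick sanity check, the chain-rule gradients can be compared against central finite differences on synthetic data. This is a sketch (the data, point $(m, b) = (2.0, 0.5)$, and step size $h$ are arbitrary choices for illustration):

```python
import numpy as np

def sse(m, b, x, y):
    """Sum of squared errors for the line y = m*x + b."""
    r = y - (m * x + b)
    return np.sum(r ** 2)

def sse_grad(m, b, x, y):
    """Analytic gradients via the chain rule:
    dSSE/dm = -2 * sum(x_i * (y_i - (m*x_i + b)))
    dSSE/db = -2 * sum(y_i - (m*x_i + b))
    """
    r = y - (m * x + b)
    return -2 * np.sum(x * r), -2 * np.sum(r)

# Synthetic data from a known line plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 3.0 * x + 1.5 + rng.normal(scale=0.1, size=50)

# Evaluate at an arbitrary (m, b) and compare with central differences.
m, b, h = 2.0, 0.5, 1e-6
dm, db = sse_grad(m, b, x, y)
dm_fd = (sse(m + h, b, x, y) - sse(m - h, b, x, y)) / (2 * h)
db_fd = (sse(m, b + h, x, y) - sse(m, b - h, x, y)) / (2 * h)
print(dm, dm_fd)
print(db, db_fd)
```

Since SSE is quadratic in $m$ and $b$, the central differences agree with the analytic gradients up to floating-point error; the incorrect formulas (without the $-x_i$ and $-1$ factors) would not match.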