When does this least squares analytical solution based on zeros of partial derivatives start providing more than one solution?


If I want to fit a quadratic function of two variables to some data, I can use

$$f(x, y) = c_1 x^2 + c_2 xy + c_3 y^2 + c_4 x + c_5 y + c_6$$

$$\frac{\partial}{\partial c_i} \sum_j\left( z_j - f(x_j, y_j) \right)^2 = 0$$

to obtain six equations, and then endeavor to solve them.
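In practice those six equations are the normal equations of a linear system. A minimal sketch in Python/NumPy, using hypothetical noiseless data generated from known coefficients (the coefficient values and sample sizes here are illustrative, not from the question):

```python
import numpy as np

# Hypothetical data: z generated exactly from known coefficients c_true
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = rng.uniform(-1, 1, 50)
c_true = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])

# Design matrix: one column per basis function of f(x, y)
A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
z = A @ c_true

# Setting the six partial derivatives to zero gives the normal
# equations A^T A c = A^T z; np.linalg.lstsq solves the same
# least-squares problem in a numerically stabler way.
c_hat, *_ = np.linalg.lstsq(A, z, rcond=None)
print(c_hat)  # recovers c_true since the data are noiseless
```

With full-rank `A` (which random data gives almost surely), the solution is unique and recovers `c_true` exactly here because there is no noise.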

I've done this for one variable, not two, but I'm guessing the process is straightforward.

If I extend this to more variables, and to orders higher than 2, when will the analytical expressions be at risk of having multiple solutions?

1 Answer

Accepted answer:

As @ClaudeLeibovici quickly pointed out, for least squares the resulting system of equations will always be linear in the coefficients $c_i$.

This is one reason why least squares is/was so convenient/popular back when folks were "writing with feathers using light from burning animal fat".

For example:

$$\frac{\partial}{\partial c_1} \sum_j\left( z_j - f(x_j, y_j) \right)^2 = -2 \sum_j \left( z_j - f(x_j, y_j) \right) \frac{\partial f}{\partial c_1}$$ and since $\frac{\partial f}{\partial c_1} = x_j^2$ does not depend on any of the $c_i$, while $\left( z_j - f(x_j, y_j) \right)$ is linear in the $c_i$, setting each such derivative to zero yields a linear system. That system has exactly one solution whenever the data make it nonsingular (i.e., the design matrix has full column rank).

The number of variables and the powers to which they are raised are unimportant. It's only the power of $2$ in "least squares" that matters here: differentiating a square leaves an expression of degree one in the $c_i$.
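The linearity claim can be checked numerically: the gradient of the squared-error loss with respect to the coefficients is an affine function of $c$, so it respects affine combinations. A small sketch, with hypothetical random data (the array shapes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
y = rng.uniform(-1, 1, 30)
z = rng.uniform(-1, 1, 30)

# Same six basis functions as the quadratic f(x, y) above
A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])

def grad(c):
    # Gradient of sum_j (z_j - f_j)^2 with respect to c: -2 A^T (z - A c)
    return -2 * A.T @ (z - A @ c)

c1 = rng.normal(size=6)
c2 = rng.normal(size=6)
t = 0.3

# An affine map g satisfies g(t*c1 + (1-t)*c2) = t*g(c1) + (1-t)*g(c2)
lhs = grad(t * c1 + (1 - t) * c2)
rhs = t * grad(c1) + (1 - t) * grad(c2)
print(np.allclose(lhs, rhs))  # True: the gradient is affine in c
```

Because the gradient is affine in $c$, setting it to zero is a linear solve, regardless of how high the polynomial degree in $x$ and $y$ is.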