In the "least-squares" minimization problem, is the solution unique?


I want to minimize:

$$\min f(a,b)= \frac{1}{2}\sum_{i=1}^n(ax_i+b-y_i)^2 $$

I can do this by setting the gradient equal to zero, which leaves me with two equations in two unknowns.

Gradient: $$\nabla f=\begin{pmatrix} \partial_a f \\ \partial_bf \end{pmatrix}=\begin{pmatrix} \sum_{i=1}^n x_i(ax_i+b-y_i) \\ \sum_{i=1}^n ax_i+b-y_i\end{pmatrix}$$

After a bit of algebra I get:

$$a=\frac{n\sum_{i=1}^n x_i y_i-\sum_{i=1}^n x_i\sum_{i=1}^n y_i}{n\sum_{i=1}^n x_i^2-\left(\sum_{i=1}^n x_i\right)^2} \\ b= \frac{1}{n}\left( \sum_{i=1}^n y_i-a\sum_{i=1}^n x_i\right)$$
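The two equations from the gradient form a $2\times 2$ linear system, so one can also solve it directly instead of eliminating by hand. A minimal numerical sketch (hypothetical data, NumPy assumed), cross-checked against NumPy's own degree-1 fit:

```python
import numpy as np

# Hypothetical data for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.9, 5.1, 7.0])
n = len(x)

# Normal equations obtained by setting the gradient to zero:
#   a*sum(x^2) + b*sum(x) = sum(x*y)
#   a*sum(x)   + b*n      = sum(y)
M = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    n       ]])
rhs = np.array([np.sum(x * y), np.sum(y)])
a, b = np.linalg.solve(M, rhs)

# Agrees with NumPy's built-in least-squares line fit.
a_ref, b_ref = np.polyfit(x, y, 1)
print(np.allclose([a, b], [a_ref, b_ref]))  # True
```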

Is this solution unique? Is there a way to see this right away, or do I have to convert the problem to matrix form and look at the column space of $A$?
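For the matrix-form route mentioned above: the design matrix is $A=[\,x \mid \mathbf{1}\,]$, and the minimizer is unique exactly when $A$ has full column rank. A small sketch with hypothetical data (NumPy assumed):

```python
import numpy as np

# Distinct x values: the columns [x] and [1] are linearly independent.
x = np.array([1.0, 2.0, 3.0, 4.0])
A = np.column_stack([x, np.ones_like(x)])  # design matrix [x | 1]
print(np.linalg.matrix_rank(A))  # 2: full column rank, unique minimizer

# Degenerate data: all x_i equal, so the column [x] is a multiple of [1].
A_deg = np.column_stack([np.full(4, 3.0), np.ones(4)])
print(np.linalg.matrix_rank(A_deg))  # 1: infinitely many minimizers
```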

Best answer:

Whether the solution is unique depends on the input data. If $x_i$ is the same for all $i$, say $x_i=c$, then $f$ depends only on the combination $ac+b$, so you can set $a$ arbitrarily and let $b=\frac{1}{n}\sum_i y_i - ac$.

By looking at the Hessian and relating it to convexity and strict convexity, you will see that the solution is unique whenever the $x_i$ are not all equal: the Hessian is the constant matrix $\begin{pmatrix} \sum_i x_i^2 & \sum_i x_i \\ \sum_i x_i & n \end{pmatrix}$, whose determinant $n\sum_i x_i^2-\left(\sum_i x_i\right)^2$ is nonnegative by the Cauchy-Schwarz inequality and vanishes exactly when all $x_i$ coincide. So $f$ is strictly convex, and the minimizer unique, precisely when the $x_i$ are not all equal.
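This Hessian criterion is easy to check numerically. A small sketch (hypothetical data, NumPy assumed):

```python
import numpy as np

# The Hessian of f is constant in (a, b):
#   H = [[sum(x^2), sum(x)],
#        [sum(x),   n     ]]
# det(H) = n*sum(x^2) - (sum(x))^2 >= 0 by Cauchy-Schwarz,
# with equality exactly when all x_i coincide.
def hessian(x):
    n = len(x)
    return np.array([[np.sum(x**2), np.sum(x)],
                     [np.sum(x),    n       ]])

x = np.array([1.0, 2.0, 4.0])          # not all equal
print(np.linalg.det(hessian(x)) > 0)   # True: strictly convex, unique minimum

x_deg = np.array([3.0, 3.0, 3.0])      # all equal
print(np.isclose(np.linalg.det(hessian(x_deg)), 0.0))  # True: flat direction
```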