Solving Norm-Constrained Homogeneous Linear Least Squares


I am learning how to solve a norm-constrained homogeneous linear least squares problem.

$$\min_x \|Ax\|^2 \quad \text{subject to} \quad \|x\| = 1$$

The problem is set up with a Lagrangian as follows:

cost: $\|Ax\|^2$
constraint: $1 - \|x\|^2 = 0$

$$L(x, \lambda) = \|Ax\|^2 + \lambda\left(1 - \|x\|^2\right)$$

However, at this point I am unsure how to take the partial derivative of the Lagrangian with respect to $x$. My confusion stems from unfamiliarity with differentiating norms of matrix-vector products, so if someone could spell out that background in their solution, it would be much appreciated.

If someone has any suggestions on how to make this post more general in order to benefit others, suggestions are welcomed.

There is 1 answer below.


A couple of tips:

  1. $\|Ax\|^2 = x^T(A^TA)x$. Try verifying this yourself by expanding $\|Ax\|^2 = (Ax)^T(Ax)$. Then, with $B = A^TA$ symmetric, the standard matrix-calculus identity $\frac{\partial}{\partial x}\, x^T B x = 2Bx$ gives the derivative you are looking for; see the Wikipedia article on matrix calculus.
  2. If you are not set on the Lagrangian route for this problem, note that $$\min_{\|x\|_2=1}x^TBx\,=\,\lambda_{\min}(B)$$ for every symmetric matrix $B$, where $\lambda_{\min}(\cdot)$ denotes the smallest eigenvalue; the minimum is attained at a corresponding unit-norm eigenvector. This is an instance of the Courant-Fischer min-max theorem.
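As a concrete check of tip 2, here is a minimal NumPy sketch (the matrix `A` is arbitrary example data, not from the original post): the constrained minimizer is the eigenvector of $B = A^TA$ belonging to its smallest eigenvalue.

```python
import numpy as np

# A minimal sketch: solve min ||Ax||^2 subject to ||x|| = 1 via the
# smallest eigenpair of B = A^T A.  The matrix A below is arbitrary
# example data for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

B = A.T @ A  # symmetric positive semidefinite

# np.linalg.eigh returns the eigenvalues of a symmetric matrix in
# ascending order, so index 0 gives the smallest eigenvalue and its
# (unit-norm) eigenvector.
eigvals, eigvecs = np.linalg.eigh(B)
x_opt = eigvecs[:, 0]   # unit-norm minimizer
min_value = eigvals[0]  # equals ||A @ x_opt||^2
```

Equivalently, `x_opt` is the right singular vector of $A$ associated with its smallest singular value (obtainable from `np.linalg.svd`), which is numerically preferable to forming $A^TA$ explicitly when $A$ is ill-conditioned.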