I am learning how to solve a norm-constrained homogeneous linear least squares problem.
minimize $\|Ax\|^2$ over $x$ subject to $\|x\| = 1$
The problem is set up with a Lagrangian as follows:
cost = $\|Ax\|^2$
constraint = $1 - \|x\|^2$
Lagrangian: $\mathcal{L}(x, \lambda) = \|Ax\|^2 + \lambda\left(1 - \|x\|^2\right)$
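To make sure my setup is sound before differentiating, here is a small numerical sanity check (my understanding, please correct me if wrong): for the Euclidean norm, $\|v\|^2 = v^\top v$, so the cost can be rewritten as $\|Ax\|^2 = (Ax)^\top(Ax) = x^\top A^\top A x$, a quadratic form in $x$. The matrix dimensions below are arbitrary.

```python
import numpy as np

# Check the identity ||Ax||^2 = x^T (A^T A) x for a random A and x.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

lhs = np.linalg.norm(A @ x) ** 2  # ||Ax||^2 via the Euclidean norm
rhs = x @ A.T @ A @ x             # quadratic form x^T (A^T A) x

assert np.isclose(lhs, rhs)
print(lhs, rhs)
```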
However, at this point I am unsure how to take the partial derivative of the Lagrangian with respect to $x$. My confusion stems from unfamiliarity with norms of vectors and matrix expressions, so if someone could explain this background material in their solution, it would be much appreciated.
If anyone has suggestions on how to make this post more general so that it benefits others, they are welcome.
Couple of Tips: