Optimization problem with quadratic equality constraint


Suppose I want to minimize a convex differentiable $f(x)$ subject to $x^Tx=1$. I set up the Lagrangian: $$f(x)+\frac\lambda2(x^Tx-1).$$

Write $g$ for the gradient of $f$. Setting the gradient of the Lagrangian to zero gives $$g(x) + \lambda x=0.$$

Would it be valid to do the following? $$x^Tg(x) + \lambda x^Tx = 0$$ From feasibility, $x^Tx=1$, so $\lambda=-x^Tg(x)$. Then solve for $x$ from $$g(x) - \{x^Tg(x)\} x=0$$

I'm asking because I did just that, but the result from the numerical solver (R's rootSolve) violates the equality constraint $x^Tx=1$.
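For concreteness, here is a small sketch of that failure mode in Python with `scipy.optimize.fsolve` standing in for R's rootSolve; the quadratic objective $f(x)=\tfrac12 x^TAx$ and the matrix `A` are my own illustrative choices, not from the question.

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative convex objective f(x) = 0.5 * x^T A x with A symmetric
# positive definite, so the gradient is g(x) = A x.  (Toy choice.)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

def g(x):
    return A @ x

# The N equations left after eliminating lambda = -x^T g(x):
def reduced(x):
    return g(x) - (x @ g(x)) * x

# Started near the origin, the root finder returns x ~ 0: a perfectly
# valid root of the reduced system, but one that violates x^T x = 1.
x = fsolve(reduced, np.array([0.1, -0.1]))
print(x, x @ x)
```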

Accepted answer

Your derivation is fine, so it's ok to use your last equation

$g(x) - \{x^Tg(x)\} x=0$

(plus an additional one), and the Lagrange multiplier will do its job. The additional equation is $x^T x = 1$, which must be enforced as well, e.g. handed to the numerical solver together with the others. Generally speaking, you obtain it by taking the partial derivative of the Lagrangian with respect to $\lambda$, which exactly reproduces the condition $x^T x = 1$.
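Spelled out, the full stationarity system of the Lagrangian $L(x,\lambda)=f(x)+\frac\lambda2(x^Tx-1)$ is

$$\frac{\partial L}{\partial x} = g(x) + \lambda x = 0, \qquad \frac{\partial L}{\partial \lambda} = \frac12\left(x^Tx - 1\right) = 0,$$

so the constraint $x^Tx=1$ is itself one of the stationarity equations, not an afterthought.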

Again, simply counting equations makes this clear: if $x$ is $N$-dimensional, then $g(x) - \{x^Tg(x)\}\,x=0$ constitutes only $N$ equations, the same as in an unconstrained problem, which is one too few here. Since you have the additional constraint $x^T x = 1$, you end up solving $N+1$ equations, given that all components of $x$ appear as independent unknowns and $\lambda$ has already been eliminated.
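One way to feed a generic root finder a square system is to keep $\lambda$ as an $(N+1)$-th unknown instead of eliminating it. A minimal Python sketch with `scipy.optimize.fsolve` (the quadratic $f$ and the matrix `A` are illustrative assumptions, not from the post):

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative choice: f(x) = 0.5 * x^T A x, so g(x) = A x.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

def g(x):
    return A @ x

# N + 1 equations in the N + 1 unknowns (x, lambda):
# stationarity g(x) + lambda * x = 0 plus the constraint x^T x = 1.
def full_system(z):
    x, lam = z[:-1], z[-1]
    return np.append(g(x) + lam * x, x @ x - 1.0)

z = fsolve(full_system, np.array([1.0, 0.0, -1.0]))
x, lam = z[:-1], z[-1]
print(x, x @ x, lam)  # the returned root now satisfies the constraint
```

Enforcing $x^Tx=1$ as an explicit equation is exactly the "additional equation" the answer refers to.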

I take it that you were numerically looking for solutions of $g(x) - \{x^Tg(x)\}\,x=0$ alone; doing only this (and disregarding $x^T x = 1$) gives you too many "solutions", among them "solutions" that violate $x^T x = 1$.
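A tiny example of such spurious roots, taking (my illustrative choice) $f(x)=\tfrac12 x^Tx$ so that $g(x)=x$:

$$g(x) - \{x^Tg(x)\}\,x = x - (x^Tx)\,x = \left(1 - x^Tx\right)x = 0,$$

which is satisfied by every unit vector, but also by the spurious root $x = 0$.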

Referring to @An aedonist's comment: an $x$ that satisfies $g(x) = 0$ is not a solution unless it simultaneously satisfies $x^T x = 1$.