Solving a quadratic optimization problem


For given $ a \in \Bbb{R}^n , b \in \Bbb{R}$, solve the following optimization problem: $$\min \sum_{i=1}^n x_i^2 $$ subject to: $${a}^Tx \ge b , \quad x \ge 0. $$ Note: this is homework, so a hint is most welcome.

  1. $L(x,\lambda) = \sum_{i=1}^n x_i^2 - \lambda({a}^Tx -b) $
  2. Setting the derivative of the Lagrangian with respect to each $x_i$ to $0$ gives, $\forall i$, $ 2x_i = \lambda a_i $, but I cannot proceed from here.

According to the KKT conditions, the following has to hold $\forall i$:

  1. $2x_i - \lambda a_i \ge 0$
  2. $ x_i \ge0$
  3. $x_i(2x_i - \lambda a_i)=0$

So either $x_i =0$ or $x_i= \lambda a_i/2$, but it seems like I am making a mistake somewhere...

Best answer

Hint: You have to complement the first-order conditions (FOC) of your Lagrangian with a complementary slackness condition.


The complementary slackness condition for your problem is $$ \lambda(a^Tx-b)=0$$ so either $\lambda=0$ (i.e., the constraint does not bind) or $\lambda>0$ and $(a^Tx-b)=0$ (i.e., the constraint binds).
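The two branches can be illustrated numerically. A minimal sketch (the values of $a$ and $b$ below are made up, not from the problem) showing that when $b \le 0$ the $\lambda = 0$ branch applies and $x = 0$ is already optimal:

```python
import numpy as np

# Hypothetical data for illustration only.
a = np.array([1.0, 2.0, 2.0])
b = -1.0  # b <= 0: the constraint a^T x >= b does not bind at the optimum

# Case lambda = 0: stationarity reduces to 2 x_i = 0, so x = 0.
# x = 0 is feasible here because a^T 0 = 0 >= b, hence it is optimal.
x = np.zeros_like(a)
assert a @ x >= b        # inequality constraint holds at x = 0
assert (x >= 0).all()    # nonnegativity holds as well
```

When $b > 0$, the point $x = 0$ is infeasible, so the constraint must bind and the $\lambda > 0$ branch applies.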

Edit:

For the case $\lambda>0$ you have a system of $n+1$ linear equations in the $n+1$ unknowns $\{\{x_i\}_{i=1}^{n},\lambda\}$. Solve this system:

\begin{align} 2 x_1 - \lambda a_1&=0 \\ \vdots \\ 2 x_n - \lambda a_n&=0 \\ \sum_{i=1}^{n}a_i x_i &= b \end{align}

Try computing $\lambda$ first, then plug the value you find into the FOCs.
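The system above can also be solved numerically as a sanity check. A sketch with NumPy, under the assumption that all $a_i > 0$ and $b > 0$ so that the nonnegativity constraints stay inactive (the data are made up for illustration):

```python
import numpy as np

# Hypothetical data; assume a > 0 componentwise and b > 0 so x >= 0 is inactive.
a = np.array([1.0, 2.0, 2.0])
b = 3.0
n = len(a)

# Build the (n+1) x (n+1) linear system:
#   2 x_i - lambda a_i = 0   for i = 1..n
#   sum_i a_i x_i      = b
M = np.zeros((n + 1, n + 1))
M[:n, :n] = 2 * np.eye(n)
M[:n, n] = -a
M[n, :n] = a
rhs = np.zeros(n + 1)
rhs[n] = b

sol = np.linalg.solve(M, rhs)
x, lam = sol[:n], sol[n]

# Closed form from the same equations: x_i = lambda a_i / 2, so
# substituting into a^T x = b gives lambda = 2b / ||a||^2.
assert np.allclose(lam, 2 * b / (a @ a))
assert np.allclose(x, b * a / (a @ a))
assert np.allclose(a @ x, b)  # the constraint binds, as expected for lambda > 0
```

Eliminating the $x_i$ by hand gives the same answer: $\lambda = 2b/\lVert a\rVert^2$ and $x = b\,a/\lVert a\rVert^2$, which is valid as long as every component of $x$ comes out nonnegative.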