Projection of a point exterior to the set onto a hyperplane via the Lagrangian


I was reading Stephen Boyd's convex optimization book and tried to solve the problem of Euclidean projection onto a hyperplane via the Lagrangian, but I am unable to reach the solution. Can someone please guide me through the steps of solving it via the Lagrangian? I know there are easier methods, but I want to solve it this way.

(Problem image: compute the Euclidean projection of a point $x_0$ onto the hyperplane $\{x \mid a^Tx = b\}$.)

There are 2 answers below.

Best answer:

Hint (outline):

First form Lagrangian $$L(x, \lambda) = \underbrace{\frac{1}{2} \| x - x_0 \|_2^2}_{ := f(x)} + \lambda \underbrace{\left( a^Tx - b \right)}_{:= g(x)} .$$

Then invoke KKT conditions:

  1. Stationarity condition: $\frac{\partial L(x,\lambda)}{\partial x} = 0$. This gives $x = x_0 - \lambda a$.
  2. Complementary slackness: $\lambda g(x) = 0$. Since $\lambda \geq 0$ (dual feasibility), consider the two cases a) $\lambda = 0$, which means $x_0$ is already feasible, and b) $\lambda > 0$, i.e., $g(x) = 0$. The latter case gives $\lambda = \frac{a^Tx_0 - b}{\|a\|_2^2}$. Now plug in $\lambda$ to obtain the projection onto the hyperplane.
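As a sanity check, the two steps above can be sketched numerically; the values of `a`, `b`, and `x0` below are made-up illustrative data, not from the original problem:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])    # hyperplane normal (illustrative)
b = 3.0                          # hyperplane offset: {x : a^T x = b}
x0 = np.array([4.0, 0.0, 1.0])   # point to project (illustrative)

# Stationarity of L = (1/2)||x - x0||^2 + lambda (a^T x - b):
#   (x - x0) + lambda * a = 0  =>  x = x0 - lambda * a
# Active constraint g(x) = 0 then gives:
#   lambda = (a^T x0 - b) / ||a||^2
lam = (a @ x0 - b) / (a @ a)
x = x0 - lam * a

print(np.isclose(a @ x, b))  # → True: the projected point lies on the hyperplane
```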
Second answer:

Guide:

Let's solve

$$\min_{x} \|x-x_0\|^2$$

subject to $$a^Tx=b$$

The Lagrangian is $$\|x-x_0\|^2+\lambda (a^Tx-b)$$

Differentiating it gives us $$2(x-x_0)+\lambda a = 0$$

If we know $\lambda$, we can recover $x$. To find $\lambda$, left-multiply by $a^T$ and use the fact that $a^Tx = b$. Try to take it from here.
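A hedged sketch of the remaining algebra in code (the values of `a`, `b`, `x0` are made-up for illustration): left-multiplying $2(x - x_0) + \lambda a = 0$ by $a^T$ and substituting $a^Tx = b$ gives $\lambda = \frac{2(a^Tx_0 - b)}{\|a\|^2}$, and then $x = x_0 - \frac{\lambda}{2} a$.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])    # illustrative hyperplane normal
b = 3.0                          # illustrative offset
x0 = np.array([4.0, 0.0, 1.0])   # illustrative point

# a^T * [2(x - x0) + lambda*a] = 0 and a^T x = b imply:
#   2(b - a^T x0) + lambda * ||a||^2 = 0
lam = 2 * (a @ x0 - b) / (a @ a)

# Recover x from the stationarity equation 2(x - x0) + lambda*a = 0:
x = x0 - (lam / 2) * a

print(a @ x)  # should equal b (up to floating point)
```

Note that this $\lambda$ is twice the one in the other answer because the objective here omits the factor $\frac{1}{2}$; the resulting projection $x = x_0 - \frac{a^Tx_0 - b}{\|a\|^2}\, a$ is the same.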