How is the minimization part done in projected gradient descent?


It says to do:

$$y_{k+1}=x_k-t_k\,\partial f(x_k)$$

where $x_{k+1}=\operatorname{argmin}_{z \in \chi} \|x-z\|$.

So is one supposed to solve for $x_{k+1}$ using some other minimization method? Or what is one supposed to do?

1 Answer

You probably meant: $$ y_{k+1}=x_k-t_k\,\partial f(x_k) $$

where $x_{k+1}=\operatorname{argmin}_{z \in \chi} \|y_{k+1}-z\|$, i.e. $x_{k+1}$ is the Euclidean projection of $y_{k+1}$ onto $\chi$. So yes: each iteration solves a small minimization problem, but for many common sets $\chi$ the projection has a closed form, so no general-purpose solver is needed.
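A minimal NumPy sketch of the two-step iteration, assuming for illustration that $\chi$ is the box $[0,1]^n$ (whose Euclidean projection is just component-wise clipping) and a simple quadratic objective; the objective, step size, and iteration count are all assumptions, not from the question:

```python
import numpy as np

def project_box(y, lo=0.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n:
    # argmin_{z in X} ||y - z|| reduces to component-wise clipping.
    return np.clip(y, lo, hi)

def projected_gradient_descent(grad_f, x0, step, n_iters=100):
    x = x0
    for _ in range(n_iters):
        y = x - step * grad_f(x)   # gradient step: y_{k+1} = x_k - t_k grad f(x_k)
        x = project_box(y)         # projection step: x_{k+1} = P_X(y_{k+1})
    return x

# Illustrative problem: minimize ||x - c||^2 over [0,1]^2, with c outside the box.
c = np.array([2.0, -1.0])
grad_f = lambda x: 2.0 * (x - c)
x_star = projected_gradient_descent(grad_f, x0=np.zeros(2), step=0.1)
# converges to the box projection of c, namely [1.0, 0.0]
```

The point is that `project_box` is itself the solution of the inner minimization, available in closed form, so no extra solver is invoked per iteration.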

How to project depends on your set $\chi$; some projections are easy to compute in closed form. For instance, if you have to satisfy $\|Fx\|_2^2 = P$, you can normalize the power of your solution, rescaling $y_{k+1}$ so that $x_{k+1}$ respects the constraint above.
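That normalization step can be sketched as follows; the matrix `F`, power level `P`, and the point `y` are illustrative assumptions. Note the radial rescaling enforces the constraint exactly, though it coincides with the true Euclidean projection only in special cases (e.g. when $F$ is a scaled orthogonal matrix):

```python
import numpy as np

def normalize_power(y, F, P):
    # Rescale y so that ||F y||_2^2 = P holds exactly:
    # for x = a*y with a = sqrt(P)/||F y||, we get ||F x|| = sqrt(P).
    return y * np.sqrt(P) / np.linalg.norm(F @ y)

# Illustrative values (assumptions, not from the post):
rng = np.random.default_rng(0)
F = rng.standard_normal((3, 3))
y = rng.standard_normal(3)
x = normalize_power(y, F, P=4.0)
# now ||F x||^2 equals 4.0 up to floating-point error
```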