Is it possible to solve quadratic programming problem with gradient descent?


Assume that we have this quadratic function:

$$ J = \frac {1}{2}x^TQx + c^Tx $$

And our gradient is the derivative:

$$ J_g = x^TQ + c^T, $$

assuming $Q$ is symmetric (otherwise the gradient is $\frac{1}{2}x^T(Q + Q^T) + c^T$).

We want to minimize $J$ and we can do that with setting $J_g = 0$ and solve for $x^T $.
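As a small numeric sketch of the closed-form route (the values of $Q$ and $c$ below are assumed purely for illustration, with $Q$ symmetric positive definite so the minimizer is unique):

```python
import numpy as np

# Illustrative problem data (assumed): Q symmetric positive definite.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, 1.0])

# Setting the gradient Qx + c = 0 and solving gives x* = -Q^{-1} c.
x_star = np.linalg.solve(Q, -c)
print(x_star)
```

`np.linalg.solve` is preferred over forming `Q^{-1}` explicitly, since it is cheaper and numerically more stable.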

Or we can use gradient descent.

$$x^T_{k+1} = x^T_{k} - \alpha J_g(x_{k})$$

Where $\alpha > 0$ is a small positive number (the step size). Note the minus sign: to minimize $J$ we step against the gradient.
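The iteration above can be sketched as follows (same illustrative $Q$ and $c$ as before; the step size and iteration count are assumptions chosen to converge for this data):

```python
import numpy as np

# Illustrative data (assumed): Q symmetric positive definite.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, 1.0])

alpha = 0.1       # step size; must satisfy alpha < 2 / lambda_max(Q) to converge
x = np.zeros(2)   # starting point
for _ in range(500):
    grad = Q @ x + c      # gradient of 0.5 x'Qx + c'x (Q symmetric)
    x = x - alpha * grad  # step against the gradient to decrease J

print(x)  # approaches the closed-form minimizer -Q^{-1} c
```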

That sounds easy. But how would I do if I want to minimize $J$ with constraints:

$$Ax \leq b $$ $$x \geq 0$$

What would I do then? What method should I use? Can I just use if-statements to check when $x$ is outside the constraints?

Best answer:

We can reformulate the constraints as
\begin{equation} \begin{pmatrix} A \\ -I \end{pmatrix} x \leq \begin{pmatrix} b \\ 0 \end{pmatrix}, \end{equation}
and denote $\begin{pmatrix} A \\ -I \end{pmatrix}$ by $\bar{A}$ and $\begin{pmatrix} b \\ 0 \end{pmatrix}$ by $\bar{b}$. The original problem is then equivalent to
\begin{equation} \begin{array}{cl} {\min} & {\frac{1}{2} x^TQx + c^T x} \\ {\text{s.t.}} & {\bar{A}x \leq \bar{b}.} \end{array} \end{equation}

To apply gradient descent to this problem, we need one more step per iteration:
\begin{equation} \begin{aligned} & \bar{x} = x_k - \alpha(Qx_k+c), \\ & x_{k+1} = \operatorname{proj}_{\bar{A}x \leq \bar{b}}(\bar{x}), \end{aligned} \end{equation}
where $\alpha > 0$ is the step size and $\operatorname{proj}_{\bar{A}x \leq \bar{b}}(\cdot)$ is the projection operator
\begin{equation} \operatorname{proj}_{\bar{A}x \leq \bar{b}}(\bar{x}) = \arg\min_{\bar{A}x \leq \bar{b}} \|x - \bar{x}\|^2. \end{equation}
This is the projected gradient method. The projection is itself a small quadratic program; it has a cheap closed form for simple sets (e.g. a box or the nonnegative orthant) and can otherwise be computed with a QP solver or a proximal-type method.
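A minimal sketch of projected gradient descent, assuming for illustration that the only constraint is $x \geq 0$ (the special case $\bar{A} = -I$, $\bar{b} = 0$), chosen because its projection has the closed form $\max(x, 0)$; the data below is made up:

```python
import numpy as np

# Illustrative data (assumed): minimize 0.5 x'Qx + c'x subject to x >= 0.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, -1.0])  # chosen so the constraint x1 >= 0 is active
                           # at the solution

alpha = 0.1      # step size, assumed < 2 / lambda_max(Q)
x = np.zeros(2)
for _ in range(500):
    xbar = x - alpha * (Q @ x + c)  # plain gradient step
    x = np.maximum(xbar, 0.0)       # projection onto {x >= 0}: clip
                                    # negative components to zero

print(x)
```

For a general polyhedron $\bar{A}x \leq \bar{b}$ the clipping line would be replaced by solving the projection QP, e.g. with an off-the-shelf solver.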

One can also refer to the Quadratic Programming article for other methods (e.g. active-set methods or SQP).