How can I minimize a quadratic over a $2$-norm ball in Matlab?


In MATLAB, how can I solve

$$\begin{array}{ll} \text{minimize} & x^T Q x + b^T x + c\\ \text{subject to} & \|x\|_2 \leq C\end{array}$$

where $C > 0$?

I read about the function fmincon, but it seems to only allow linear constraints. I take it a norm constraint could be rephrased as a quadratic one? But I haven't found any built-in function that handles a quadratic constraint. Is there a function that does that?


BEST ANSWER

I'll assume $Q$ is symmetric positive definite. You can solve this optimization problem easily using the projected gradient method.

The objective function is $$ f(x) = x^T Q x + b^T x + c. $$ The constraint is $x \in \Omega$, where $\Omega$ is the $2$-norm ball of radius $C$ centered at the origin. We first initialize $x^0$ (I would take $x^0$ to be the zero vector). Then the projected gradient iteration is $$ x^{k+1} = P_\Omega(x^k - t \nabla f(x^k)) $$ for $k = 0, 1, 2, \ldots$. Here $t > 0$ is a step size which must be chosen to be sufficiently small in order for the algorithm to converge. The function $P_\Omega$ projects onto $\Omega$, so $P_\Omega(u)$ is the point in $\Omega$ which is closest to $u$. The gradient of $f$ is $$ \nabla f(x) = 2 Q x + b. $$
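The projection onto the ball has a simple closed form: if $u$ already lies in $\Omega$ it is left unchanged, and otherwise it is rescaled back to the boundary. A minimal MATLAB sketch (the function name proj_ball is my own choice, saved as proj_ball.m):

```matlab
% Projection onto the 2-norm ball of radius C centered at the origin:
%   P(u) = u              if ||u||_2 <= C,
%   P(u) = (C/||u||_2)*u  otherwise.
function v = proj_ball(u, C)
    nu = norm(u);
    if nu <= C
        v = u;
    else
        v = (C / nu) * u;
    end
end
```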

Typically one might run this algorithm for a few hundred or a few thousand iterations, or until some stopping criterion is met (for example, you could stop the iteration if $\| \nabla f(x^k) \|_2$ is less than some threshold value). When you implement this, make a semilogy plot of the objective function value versus iteration in order to check whether or not the algorithm is converging. You should see the objective function value decreasing nicely towards a minimum value. Compare such convergence plots for various values of $t$ in order to choose a good step size $t$ that gives fast convergence.
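Putting the pieces together, here is a minimal MATLAB sketch of the whole iteration (all variable names are my own; the fixed step size $t = 1/(2\lambda_{\max}(Q))$ is a safe choice here, since the gradient $2Qx + b$ is Lipschitz continuous with constant $2\lambda_{\max}(Q)$):

```matlab
% Projected gradient for: minimize x'*Q*x + b'*x + c  s.t.  ||x||_2 <= C.
% Assumes Q symmetric positive definite; Q, b, c, C are given.
n = numel(b);
x = zeros(n, 1);                        % x^0 = 0
t = 1 / (2 * max(eig(Q)));              % safe fixed step size
maxit = 1000; tol = 1e-8;
fvals = zeros(maxit, 1);
for k = 1:maxit
    g = 2*Q*x + b;                      % gradient of f
    u = x - t*g;                        % gradient step
    if norm(u) > C                      % project onto the ball
        u = (C / norm(u)) * u;
    end
    x = u;
    fvals(k) = x'*Q*x + b'*x + c;
    if norm(g) < tol                    % crude stopping criterion
        fvals = fvals(1:k);
        break
    end
end
% Convergence check; shift the values so they are positive for semilogy.
semilogy(fvals - min(fvals) + eps);
xlabel('iteration'); ylabel('f(x^k) (shifted)');
```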

If you need an algorithm that converges faster, it is only slightly more difficult to implement a line search version of the projected gradient method (where the step size $t$ is chosen adaptively at each iteration, using a line search procedure). You could also try an accelerated proximal gradient method such as FISTA (which is also easy to implement).


If $Q$ is symmetric positive semidefinite, this can be formulated and solved as a Second-Order Cone Problem (SOCP). Under MATLAB, it can be entered into CVX or YALMIP, which can call the free solvers SeDuMi or SDPT3 to solve it.

Here is the CVX formulation (where n is the dimension of $x$):

cvx_begin
    variable x(n)
    minimize( quad_form(x, Q) + b'*x + c )
    subject to
        norm(x) <= C
cvx_end

If you install CVX, it includes the free solvers SeDuMi and SDPT3, and will solve this problem if you enter it as shown.

Note that the constant $c$ does not affect the minimizer $x$ (i.e., the argmin); it only shifts the optimal objective value.