In MATLAB, how can I solve
$$\begin{array}{ll} \text{minimize} & x^TQx + b^Tx + c\\ \text{subject to} & \|x\|_2 \leq C\end{array}$$
where $C > 0$?
I read about the function fmincon, but it seems to only allow linear constraints. I take it a norm constraint could be rephrased as a quadratic one? But I haven't found any built-in function that deals with a quadratic constraint. Is there a function that does that?
I'll assume $Q$ is symmetric positive definite. You can solve this optimization problem easily using the projected gradient method.
The objective function is $$ f(x) = x^T Q x + b^T x + c. $$ The constraint is $x \in \Omega$, where $\Omega$ is the $2$-norm ball of radius $C$ centered at the origin. We first initialize $x^0$ (I would take $x^0$ to be the zero vector, which is feasible). Then the projected gradient iteration is $$ x^{k+1} = P_\Omega(x^k - t \nabla f(x^k)) $$ for $k = 0, 1, 2, \ldots$. Here $t > 0$ is a step size, which must be chosen sufficiently small for the algorithm to converge; a fixed step $t \leq 1/L$ works, where $L = 2\lambda_{\max}(Q)$ is the Lipschitz constant of $\nabla f$. The function $P_\Omega$ projects onto $\Omega$, so $P_\Omega(u)$ is the point in $\Omega$ closest to $u$. For this ball constraint the projection has a closed form: $$ P_\Omega(u) = \begin{cases} u & \text{if } \|u\|_2 \leq C, \\ C u / \|u\|_2 & \text{otherwise.} \end{cases} $$ The gradient of $f$ is $$ \nabla f(x) = 2 Q x + b. $$
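The iteration is easy to implement in any numerical language; here is a minimal sketch in Python/NumPy, which translates line-for-line into MATLAB (`norm`, a `for` loop, and the same update). The example values of `Q`, `b`, and `C` at the bottom are just for illustration, chosen so the constrained minimizer is easy to verify by hand.

```python
import numpy as np

def project_ball(u, C):
    """Project u onto the 2-norm ball of radius C centered at the origin."""
    norm_u = np.linalg.norm(u)
    return u if norm_u <= C else (C / norm_u) * u

def projected_gradient(Q, b, C, t, num_iters=1000, tol=1e-10):
    """Minimize x^T Q x + b^T x subject to ||x||_2 <= C.

    (The constant c does not affect the minimizer, so it is omitted.)"""
    x = np.zeros(len(b))              # x^0 = 0, which is feasible
    for _ in range(num_iters):
        grad = 2 * Q @ x + b          # gradient of f
        x_new = project_ball(x - t * grad, C)
        if np.linalg.norm(x_new - x) <= tol:
            break                     # iterates have stopped moving
        x = x_new
    return x

# Sanity check: Q = I, b = (-4, 0), C = 1.  The unconstrained minimizer
# is (2, 0); projecting onto the unit ball gives x* = (1, 0).
Q = np.eye(2)
b = np.array([-4.0, 0.0])
x_star = projected_gradient(Q, b, C=1.0, t=0.25)   # t = 0.25 < 1/L = 0.5
```

Note the stopping test is on the change in the iterate, not on the gradient: at a constrained minimizer on the boundary of the ball, the gradient generally does not vanish.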
Typically one might run this algorithm for a few hundred or a few thousand iterations, or until some stopping criterion is met. For example, you could stop when $\| x^{k+1} - x^k \|_2$ falls below some threshold; note that $\| \nabla f(x^k) \|_2$ need not go to zero here, since the minimizer may lie on the boundary of $\Omega$. When you implement this, make a semilogy plot of the objective function value versus iteration to check whether the algorithm is converging. You should see the objective value decreasing nicely towards a minimum. Compare such convergence plots for various values of $t$ in order to choose a step size that gives fast convergence.

If you need an algorithm that converges faster, it is only slightly more difficult to implement a line search version of the projected gradient method, where the step size $t$ is chosen adaptively at each iteration using a line search procedure. You could also try an accelerated proximal gradient method such as FISTA, which is also easy to implement.
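Here is a sketch of the line search variant, again in NumPy, using the standard backtracking rule for projected/proximal gradient methods (halve the trial step until a sufficient-decrease condition holds). It also records the objective history, which is exactly what you would feed to a semilogy plot; the example problem at the bottom is just for illustration.

```python
import numpy as np

def project_ball(u, C):
    """Project u onto the 2-norm ball of radius C."""
    norm_u = np.linalg.norm(u)
    return u if norm_u <= C else (C / norm_u) * u

def pg_backtracking(Q, b, C, num_iters=500, t0=1.0, beta=0.5):
    """Projected gradient with backtracking line search.

    Each iteration starts from the trial step t0 and multiplies t by beta
    until the sufficient-decrease condition
        f(x+) <= f(x) + grad^T (x+ - x) + ||x+ - x||^2 / (2t)
    holds.  This condition guarantees the objective is nonincreasing."""
    f = lambda x: x @ Q @ x + b @ x   # the constant c is irrelevant
    x = np.zeros(len(b))
    history = [f(x)]                  # objective value per iteration
    for _ in range(num_iters):
        grad = 2 * Q @ x + b
        t = t0
        while True:
            x_new = project_ball(x - t * grad, C)
            d = x_new - x
            if f(x_new) <= f(x) + grad @ d + (d @ d) / (2 * t):
                break
            t *= beta                 # step too large; shrink and retry
        x = x_new
        history.append(f(x))
    return x, history

# Same illustrative problem as before: minimizer on the boundary at (1, 0).
Q = np.eye(2)
b = np.array([-4.0, 0.0])
x_star, history = pg_backtracking(Q, b, C=1.0)
```

In MATLAB you would then plot the convergence with something like `semilogy(history - min(history))` (shifting by the best value seen, since the optimal objective here is negative).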