I have the following quadratic program
$$\min_x x^TAx \qquad \text{s.t.} \quad Ax \in [a,b]^m$$
where the matrix $A$ is positive semidefinite and appears in both the objective function and the constraint. I would like to solve this problem for a large-scale input matrix $A$ and am looking for a fast method.
Is there a specific recommendation among the various optimization methods, such as projected gradient, proximal gradient, trust-region, active-set, and so forth? Any comparison between these methods (and others) would be appreciated.


One option that I think is promising is to formulate your problem as minimizing $f(x) + g(x) + h(Ax)$, where $f(x) = x^T A x$, $g(x) = 0$, and $h(y) = I_{[a,b]^m}(y)$. (That is, $h$ is the convex indicator function of the box $[a,b]^m$.) The function $f$ is differentiable, and $g$ and $h$ have inexpensive proximal operators, so you can solve this optimization problem using the PD3O method (a recent three-operator splitting method).
In this approach, we never have to solve a linear system involving $A$; each iteration only requires matrix-vector products with $A$ and $A^T$, which is attractive when $A$ is large.
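Here is a minimal NumPy sketch of how the PD3O iteration could look for this particular splitting. It assumes $A$ is symmetric PSD (so $\nabla f(x) = 2Ax$), uses the Moreau identity to evaluate the prox of $h^*$ via a box projection, and picks conservative step sizes satisfying $\gamma < 2/L$ and $\gamma\delta\|A\|_2^2 \le 1$; the function name and step-size choices are illustrative, not canonical.

```python
import numpy as np

def pd3o_box_qp(A, a, b, n_iter=500):
    """Minimize x^T A x subject to A x in [a, b]^m via PD3O.

    Splitting: f(x) = x^T A x (smooth), g = 0, and h the indicator of
    the box [a, b]^m, composed with the linear map A.
    Assumes A is symmetric PSD, so grad f(x) = 2 A x.
    """
    m, n = A.shape
    x = np.zeros(n)
    s = np.zeros(m)                     # dual variable for the constraint A x in [a, b]^m
    op_norm = np.linalg.norm(A, 2)      # ||A||_2; for very large A, estimate by power iteration instead
    L = 2.0 * op_norm                   # Lipschitz constant of grad f (lambda_max(A) <= ||A||_2)
    gamma = 1.0 / L                     # primal step size, needs gamma < 2 / L
    delta = 1.0 / (gamma * op_norm**2)  # dual step size, needs gamma * delta * ||A||_2^2 <= 1
    grad = lambda v: 2.0 * (A @ v)
    gx = grad(x)
    for _ in range(n_iter):
        # Primal update: the prox of g = 0 is the identity.
        x_new = x - gamma * gx - gamma * (A.T @ s)
        gx_new = grad(x_new)
        # Dual update: prox of h* via the Moreau identity,
        # prox_{delta h*}(v) = v - delta * proj_box(v / delta).
        v = s + delta * (A @ (2.0 * x_new - x + gamma * (gx - gx_new)))
        s = v - delta * np.clip(v / delta, a, b)
        x, gx = x_new, gx_new
    return x
```

As a sanity check, with $A = I$ and the box $[1,2]^3$ the problem reduces to minimizing $\|x\|^2$ over $x \in [1,2]^3$, and `pd3o_box_qp(np.eye(3), 1.0, 2.0)` converges to the all-ones vector.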