I want to solve a problem of the form
$$\begin{aligned} \arg \min_{x} \quad & {x}^{T} A x + {b}^{T} x \\ \text{subject to} \quad & l \preceq x \preceq u \end{aligned}$$
where $A$ is positive semidefinite, so the objective should be convex.
However, the eigenvalues of $A$ are all zero or close to zero, and due to floating-point rounding some of them may come out slightly negative on my computer.
I'm looking for a way to solve this problem numerically. Right now I'm using gradient descent and repair iterates that leave the box by projecting them back onto its boundary, but I'm not sure whether that is a valid approach.
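For reference, here is a minimal sketch of the approach described above (gradient step followed by clipping back into the box, i.e. projected gradient descent); the function name, step size, and iteration count are illustrative choices, not part of the question:

```python
import numpy as np

def projected_gradient_descent(A, b, l, u, x0, step=1e-2, iters=500):
    """Minimize x^T A x + b^T x over the box [l, u] by taking a
    gradient step and then clipping (projecting) back into the box."""
    x = np.clip(x0, l, u)
    for _ in range(iters):
        grad = (A + A.T) @ x + b          # gradient of x^T A x + b^T x
        x = np.clip(x - step * grad, l, u)  # projection onto the box
    return x

# degenerate example: A = 0 (all eigenvalues zero), objective is b^T x
A = np.zeros((2, 2))
b = np.array([1.0, -1.0])
l = np.array([0.0, 0.0]); u = np.array([1.0, 1.0])
x = projected_gradient_descent(A, b, l, u, x0=np.array([0.5, 0.5]))
```

Since the box is a convex set and clipping is exactly the Euclidean projection onto it, this is the standard projected gradient method rather than an ad hoc repair.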
Which method of numerical optimization would one use for this kind of problem, keeping in mind the machine-precision issues described above?
Hint: let $A_k = A + \frac{1}{k} I_n$. Solve the problem with $A_k$ in place of $A$, and increase $k$ from one iteration to the next.
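The hint amounts to a regularization/continuation scheme: each $A_k$ is strictly positive definite, so every subproblem is well conditioned, and as $k \to \infty$ the subproblems approach the original one. A sketch of this, using SciPy's `L-BFGS-B` for the box-constrained solves and warm-starting each solve from the previous solution (the function name and schedule of $k$ values are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def solve_regularized(A, b, l, u, k_values=(1, 10, 100, 1000)):
    """Continuation scheme: solve the box-constrained QP with
    A_k = A + (1/k) I, warm-starting each solve from the last."""
    n = len(b)
    x = np.clip(np.zeros(n), l, u)         # feasible starting point
    for k in k_values:
        Ak = A + (1.0 / k) * np.eye(n)     # strictly positive definite
        f = lambda x: x @ Ak @ x + b @ x   # regularized objective
        g = lambda x: (Ak + Ak.T) @ x + b  # its gradient
        res = minimize(f, x, jac=g, method="L-BFGS-B",
                       bounds=list(zip(l, u)))
        x = res.x                          # warm start for the next k
    return x

# same degenerate example as above: A = 0, objective reduces to b^T x
A = np.zeros((2, 2))
b = np.array([1.0, -1.0])
l = np.array([0.0, 0.0]); u = np.array([1.0, 1.0])
x = solve_regularized(A, b, l, u)
```

Because $A_k$ has smallest eigenvalue at least $1/k > 0$, rounding can no longer make the subproblem nonconvex, which is exactly what the hint buys you.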