Convex quadratic program with box constraints


I want to solve a problem of the form

$$\begin{aligned} \arg \min_{x} \quad & {x}^{T} A x + {b}^{T} x \\ \text{subject to} \quad & l \preceq x \preceq u \end{aligned}$$

where $A$ is a positive semidefinite matrix, so the objective I'm optimizing is convex.

However, the eigenvalues of $A$ are all zero or close to zero, and due to floating-point rounding some of them may come out slightly negative on my computer, making $A$ numerically indefinite.

I'm looking for a way to solve this problem numerically. Right now I'm using gradient descent and repair iterates that leave the box by forcing them back onto its boundary, but I'm not sure whether that is a valid approach.
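For what it's worth, "gradient step, then force the point back into the box" is exactly projected gradient descent, since the Euclidean projection onto a box is coordinatewise clipping. A minimal NumPy sketch, where the step size rule (one over twice the spectral norm of $A$) and the iteration count are my assumptions, not something from the question:

```python
import numpy as np

def projected_gradient(A, b, l, u, x0, step=None, iters=500):
    """Minimize x^T A x + b^T x over the box l <= x <= u by
    projected gradient descent: gradient step, then clip back
    into the box (projection onto a box is coordinatewise)."""
    x = np.clip(x0, l, u)
    if step is None:
        # 1/L with L = 2*||A||_2 an upper bound on the Lipschitz
        # constant of the gradient 2*A*x + b
        step = 1.0 / (2.0 * np.linalg.norm(A, 2) + 1e-12)
    for _ in range(iters):
        grad = 2.0 * A @ x + b              # gradient of x^T A x + b^T x
        x = np.clip(x - step * grad, l, u)  # project onto the box
    return x

# tiny example with a PSD matrix that has a zero eigenvalue
A = np.array([[1.0, 1.0], [1.0, 1.0]])  # eigenvalues 2 and 0
b = np.array([-1.0, 0.5])
l, u = np.full(2, -1.0), np.full(2, 1.0)
x = projected_gradient(A, b, l, u, x0=np.zeros(2))
```

With a small enough constant step this converges for convex quadratics even when $A$ is only semidefinite, though convergence along the null directions of $A$ can be slow.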

Which numerical optimization method would one use for this kind of problem, keeping in mind the machine-precision issues above?

There are 2 best solutions below


Hint: let $A_k = A+\frac{1}{k}I_n$, which is positive definite for every finite $k$. Solve the problem with $A_k$ in place of $A$, increasing $k$ from one solve to the next so that the perturbation $\frac{1}{k}\|x\|^2$ vanishes in the limit.
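This regularization-plus-continuation idea can be sketched as follows. The choice of SciPy's L-BFGS-B as the box-constrained solver, the schedule of $k$ values, and the warm-starting are my assumptions; the hint only specifies the regularized matrices $A_k$:

```python
import numpy as np
from scipy.optimize import minimize

def solve_regularized(A, b, l, u, ks=(1, 10, 100, 1000)):
    """Continuation on the regularized problem: minimize
    x^T A_k x + b^T x with A_k = A + (1/k) I, strictly convex
    for every finite k. Each solve warm-starts the next as k
    grows and the perturbation shrinks."""
    n = len(b)
    x0 = np.clip(np.zeros(n), l, u)
    bounds = list(zip(l, u))  # L-BFGS-B takes the box directly
    for k in ks:
        Ak = A + (1.0 / k) * np.eye(n)
        f = lambda x, Ak=Ak: x @ Ak @ x + b @ x
        g = lambda x, Ak=Ak: 2.0 * Ak @ x + b
        res = minimize(f, x0, jac=g, method="L-BFGS-B", bounds=bounds)
        x0 = res.x
    return x0

A = np.array([[1.0, 1.0], [1.0, 1.0]])  # PSD with a zero eigenvalue
b = np.array([-1.0, 0.5])
l, u = np.full(2, -1.0), np.full(2, 1.0)
x = solve_regularized(A, b, l, u)
```

Warm-starting matters here: the minimizers of the successive regularized problems are close to each other, so each later solve is cheap.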


One way to deal with the zero eigenvalues is to perform *eigenvalue clipping*. The matrix $A$ can be written via its eigendecomposition $$A = Q\Lambda Q^{-1}$$ where $\Lambda$ is a diagonal matrix holding the eigenvalues of $A$ (and since $A$ is symmetric, $Q$ is orthogonal, so $Q^{-1} = Q^T$). Replace the zero and near-zero eigenvalues in $\Lambda$ with a small positive number $\epsilon$ and call the resulting matrix $\Lambda'$. Then construct $$A' = Q\Lambda'Q^{-1}.$$ $A'$ is positive definite, and for small $\epsilon$ it differs from $A$ only slightly, so you should be able to perform the optimization with $A'$ without numerical trouble.