Gradient descent with box constraints and possible non-convex function.


Hope you are well.

I am working on a quadratic optimization problem (see below). Of the four variables, only two have a non-positivity constraint. Am I correct to say that gradient descent is my best option? (My suggested approach is shown below the problem.)

EDIT: Change in the problem: of the four variables, two have a non-positivity constraint ($x_1 \le 0$ and $x_2 \le 0$); in addition, all variables are physically limited (box constraints $l \le x_i \le u$, $i = 1,\dots,4$). Note that for some values of $w_i$, $i = 1,\dots,6$, the Hessian can become indefinite (one eigenvalue changes sign). A convex QP solver is therefore not robust enough by itself, hence my suggestion of gradient descent (multiple random initializations, keeping the solution with the lowest objective value).
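The multi-start projected gradient descent idea can be sketched as follows. This is a minimal illustration, not the actual problem: the values of $A$, $B$, $C$, $d$, $e$, $f$, the weights, and the bounds are all hypothetical placeholders, and the box constraint is enforced by clipping after each step.

```python
import numpy as np

# Hypothetical problem data -- replace with your own values.
A, B, C = 1.0, 2.0, 0.5
d, e, f = 1.0, 1.0, 1.0
w = np.array([1.0, 1.0, 1.0, 0.1, 0.1, 0.1])  # w_1 .. w_6

# Per-variable box bounds; the upper bound 0 on x_1, x_2
# encodes the non-positivity constraint.
lower = np.array([-5.0, -5.0, -5.0, -5.0])
upper = np.array([0.0, 0.0, 5.0, 5.0])

def J(x):
    x1, x2, x3, x4 = x
    return (w[0] * (A - x1 - x2) ** 2
            + w[1] * (B - x3 - x4) ** 2
            + w[2] * (C - (d * x1 - d * x2 + e * x3 - f * x4)) ** 2
            + w[3] * x1 ** 2 + w[3] * x2 ** 2
            + w[4] * x3 ** 2 + w[5] * x4 ** 2)

def grad_J(x):
    # Analytic gradient of the quadratic cost above.
    x1, x2, x3, x4 = x
    r1 = A - x1 - x2
    r2 = B - x3 - x4
    r3 = C - (d * x1 - d * x2 + e * x3 - f * x4)
    return np.array([
        -2 * w[0] * r1 - 2 * w[2] * d * r3 + 2 * w[3] * x1,
        -2 * w[0] * r1 + 2 * w[2] * d * r3 + 2 * w[3] * x2,
        -2 * w[1] * r2 - 2 * w[2] * e * r3 + 2 * w[4] * x3,
        -2 * w[1] * r2 + 2 * w[2] * f * r3 + 2 * w[5] * x4,
    ])

def projected_gd(x0, step=0.01, iters=2000):
    # Gradient step followed by projection onto the box.
    x = np.clip(x0, lower, upper)
    for _ in range(iters):
        x = np.clip(x - step * grad_J(x), lower, upper)
    return x

# Multiple random initializations; keep the lowest objective value.
rng = np.random.default_rng(0)
starts = rng.uniform(lower, upper, size=(20, 4))
best = min((projected_gd(x0) for x0 in starts), key=J)
print(best, J(best))
```

For a non-convex instance each run only finds a local minimum, so the multi-start loop is what gives some (heuristic, not guaranteed) protection against getting stuck at a poor stationary point.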

EDIT: Original question

For some values of $w_i$, $i = 1,\dots,6$, the Hessian is no longer positive definite: some eigenvalues become zero or negative. This would imply that the function becomes non-convex (right?). Given only one stationary point, would this imply a concave function, or a saddle point? I am not sure how to tell the difference from the eigenvalues of the Hessian.

Hence in these cases the optimization problem could have no solution, correct? Or does one still exist? (It is primarily the first eigenvalue that changes sign, which is where I get confused.)
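Since $J$ is quadratic, its Hessian is constant, and the eigenvalue signs classify the stationary point directly: all positive means a strict minimum (strictly convex), all negative a maximum (concave), and mixed signs a saddle point. A small numerical sketch, again with hypothetical coefficients $d$, $e$, $f$ and weights:

```python
import numpy as np

def hessian(w, d=1.0, e=1.0, f=1.0):
    """Constant Hessian of the quadratic J (w = [w1..w6], hypothetical data)."""
    a1 = np.array([1.0, 1.0, 0.0, 0.0])  # from J1 = w1 (A - x1 - x2)^2
    a2 = np.array([0.0, 0.0, 1.0, 1.0])  # from J2 = w2 (B - x3 - x4)^2
    a3 = np.array([d, -d, e, -f])        # from J3
    return (2 * w[0] * np.outer(a1, a1)
            + 2 * w[1] * np.outer(a2, a2)
            + 2 * w[2] * np.outer(a3, a3)
            + 2 * np.diag([w[3], w[3], w[4], w[5]]))  # from J4

def classify(H, tol=1e-10):
    lam = np.linalg.eigvalsh(H)  # sorted ascending
    if np.all(lam > tol):
        return "positive definite: strictly convex, unique minimizer"
    if np.all(lam < -tol):
        return "negative definite: concave, stationary point is a maximum"
    if np.any(lam > tol) and np.any(lam < -tol):
        return "indefinite: stationary point is a saddle"
    return "semidefinite: flat directions, minimizer need not be unique"

# All weights positive -> sum of convex terms -> convex.
print(classify(hessian(np.array([1.0, 1.0, 1.0, 0.1, 0.1, 0.1]))))
# A negative weight can flip an eigenvalue and make J non-convex.
print(classify(hessian(np.array([1.0, 1.0, -1.0, 0.1, 0.1, 0.1]))))
```

Note that with all $w_i > 0$, $J$ is a sum of convex quadratics and hence convex; an indefinite Hessian (mixed eigenvalue signs) can only arise if some weight is negative, and then the unconstrained stationary point is a saddle, not a maximum.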


$\min_{x_1,\dots,x_4} \quad J = J_1 + J_2 + J_3 + J_4$

where,

$J_1 = w_1 (A-x_1 -x_2)^2 $

$J_2 = w_2 (B - x_3 - x_4)^2$

$J_3 = w_3 (C - [dx_1 - dx_2 + ex_3 -fx_4])^2$

$J_4 = w_4 x_1^2 + w_4x_2^2 + w_5 x_3^2 + w_6 x_4^2$

s.t.

EDIT: introduced box constraint.

$L \le x_i \le U$, $i = 1,\dots,4$ (box constraint)

with the special case that:

$x_1 \le 0$ and $x_2 \le 0$
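As one possible alternative to hand-rolled gradient descent, a bound-constrained solver such as SciPy's L-BFGS-B handles the box constraint directly; the special case $x_1 \le 0$, $x_2 \le 0$ simply becomes an upper bound of 0 on those two variables. A multi-start sketch, with all problem data ($A$, $B$, $C$, $d$, $e$, $f$, weights, bounds) again hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data -- replace with your own values.
A, B, C = 1.0, 2.0, 0.5
d, e, f = 1.0, 1.0, 1.0
w = np.array([1.0, 1.0, 1.0, 0.1, 0.1, 0.1])  # w_1 .. w_6

def J(x):
    x1, x2, x3, x4 = x
    return (w[0] * (A - x1 - x2) ** 2
            + w[1] * (B - x3 - x4) ** 2
            + w[2] * (C - (d * x1 - d * x2 + e * x3 - f * x4)) ** 2
            + w[3] * (x1 ** 2 + x2 ** 2)
            + w[4] * x3 ** 2 + w[5] * x4 ** 2)

# Box bounds; the upper bound 0 on x_1, x_2 encodes x_1 <= 0, x_2 <= 0.
bounds = [(-5.0, 0.0), (-5.0, 0.0), (-5.0, 5.0), (-5.0, 5.0)]

# Multiple random starts; keep the run with the lowest objective value.
rng = np.random.default_rng(1)
results = []
for _ in range(20):
    x0 = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
    results.append(minimize(J, x0, method="L-BFGS-B", bounds=bounds))
best = min(results, key=lambda r: r.fun)
print(best.x, best.fun)
```

With multiple starts this is the same heuristic as multi-start projected gradient descent, but each local solve converges much faster than a fixed-step gradient method; it still offers no global-optimality guarantee when the Hessian is indefinite.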