Optimization with absolute value


I am trying to minimize a function of the form

$\min \qquad\sum_{i}f_i(x_i)$

where

$f_i(x_i) = a_i|x_i|^{b_i}$

subject to linear equality constraints

$\omega_j - \sum_i \psi_{ji}x_i = 0$

I am forced to formulate it this way since $b_i$ is a random number close to $1.5$ and a negative $x_i$ is not permissible. However, this causes me problems, since it makes the objective not continuously differentiable.

Can you suggest any workaround to it?

2 Answers

BEST ANSWER

Reformulate the problem as $\min \{ \sum_i a_i z_i^{b_i} : z_i \geq x_i,\ z_i \geq -x_i,\ \omega_j - \sum_i \psi_{ji}x_i = 0 \}$. At the optimum $z_i = |x_i|$, so the two problems are equivalent. For $a_i \geq 0$ and $b_i \geq 1$, this is a convex optimization problem that can be solved with free solvers such as Ipopt.
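A minimal sketch of this epigraph reformulation using SciPy's SLSQP solver (the data `a`, `b`, `psi`, `omega` below are made-up values for illustration, not from the question):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data: minimize sum_i a_i |x_i|^{b_i}
# subject to psi @ x = omega.
a = np.array([1.0, 2.0])       # a_i >= 0
b = np.array([1.5, 1.4])       # b_i >= 1 (random numbers close to 1.5)
psi = np.array([[1.0, 1.0]])   # constraint matrix psi_{ji}
omega = np.array([3.0])        # right-hand side omega_j
n = len(a)

# Decision vector v = [x, z]; the objective uses only the epigraph
# variables z, which replace |x| in the original problem.
def objective(v):
    z = v[n:]
    return np.sum(a * z**b)

constraints = [
    # linear equalities: psi @ x = omega
    {"type": "eq", "fun": lambda v: psi @ v[:n] - omega},
    # epigraph inequalities: z_i >= x_i and z_i >= -x_i, i.e. z_i >= |x_i|
    {"type": "ineq", "fun": lambda v: v[n:] - v[:n]},
    {"type": "ineq", "fun": lambda v: v[n:] + v[:n]},
]

# Keep z nonnegative so z**b stays real during the line search.
bounds = [(None, None)] * n + [(0.0, None)] * n

v0 = np.ones(2 * n)  # feasible-ish smooth starting point
res = minimize(objective, v0, bounds=bounds, constraints=constraints)
x_opt, z_opt = res.x[:n], res.x[n:]
print(res.success, x_opt, z_opt)
```

At the solution the epigraph variables satisfy $z_i = |x_i|$, since any slack $z_i > |x_i|$ could be reduced without violating a constraint while lowering the objective.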

ANSWER

Do you really have to use the absolute value? What does $f_i(x_i)$ mean? I could suggest an even-power function $f_i(x_i) = a_i x_i^{2b_i}$, but that depends on the meaning of $f_i$ and the probability distribution of $b_i$, I guess.