optimization of a non-differentiable, component-wise step function


I would like to estimate a (local) minimum of a function $c:\mathbb{R}^N \to \mathbb{R}^+$ where:

  • $c$ is differentiable only almost everywhere,
  • there exists a component $j$ such that $\frac{\partial c}{\partial x_j}$ is $0$ almost everywhere. In other words, the contribution of component $j$ to $c(x) \in \mathbb{R}^+$ is a step function.
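For concreteness, a toy function of this type (my own illustration, not part of the original problem) could combine a smooth term in one coordinate with a piecewise-constant term in another:

```python
import numpy as np

def c(x):
    # Smooth contribution from x[0]; step-wise contribution from x[1]:
    # np.floor(x[1]) is piecewise constant, so dc/dx[1] = 0 almost everywhere.
    return (x[0] - 1.0) ** 2 + np.abs(np.floor(x[1]) + 2.0)
```

A gradient-based method applied to such a $c$ gets no signal at all from the $x_1$ coordinate, which is exactly why derivative-free methods are of interest here.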

What optimization algorithm would you suggest I try for such a function?


Take a look at the DirectSearch package for Maple. It implements four derivative-free optimization methods.

The default, method = cdos (conjugate directions with orthogonal shift), is an original line-search derivative-free method that is reliable for non-differentiable functions and for constrained optimization. See the CDOS method description.

When method = powell or method = brent is specified, Powell's conjugate direction method or Brent's principal axis method is used. These are among the fastest line-search derivative-free methods and handle badly scaled differentiable functions well, but they are very unreliable for non-differentiable functions and for constrained problems whose extremum lies on a constraint boundary.

When method = quadratic is specified, a successive quadratic approximation of the objective function is built in the main iterative cycle, with a one-dimensional line search performed between two successive extremum points of the quadratic approximation. This is the fastest derivative-free method for quadratic functions, but it is rather unreliable for non-quadratic ones; it is analogous to the Newton–Raphson method, but without using derivatives.

All four methods converge quadratically on quadratic functions.
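If Maple is not available, the same family of ideas can be tried elsewhere; for instance, here is a minimal pattern (compass) search sketch in Python, applied to a hypothetical objective of the kind described in the question (the objective `c` and all parameter values are my own illustration, not part of DirectSearch):

```python
import numpy as np

def c(x):
    # Hypothetical test objective: smooth in x[0], step-wise in x[1],
    # so the partial derivative w.r.t. x[1] is 0 almost everywhere.
    return (x[0] - 1.0) ** 2 + abs(np.floor(x[1]) + 2.0)

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimal compass (pattern) search: probe +/- step along each
    coordinate axis; halve the step whenever no probe improves."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for j in range(len(x)):
            for s in (step, -step):
                y = x.copy()
                y[j] += s
                fy = f(y)
                if fy < fx:
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5          # shrink the pattern and retry
            if step < tol:
                break
    return x, fx

x_min, f_min = compass_search(c, [0.0, 0.5])
```

Because a pattern search only compares function values, it can step across the flat plateaus of the step-wise component, although (like any local derivative-free method) it can also stall on a plateau wider than its current step size, so a generous initial step or a multi-start strategy may be needed.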