Global optimization of a non-smooth function


I have a number of functions (see two examples below), and I need to find the global optimum of each. They are non-smooth, but they are always funnel-shaped, with a single broad minimum. Zoomed out (e.g. over the x range 0-100), the function "looks" smooth, so a unimodal line-search method (golden-section search) finds an approximate position of the global minimum quite easily and quickly. The problems arise when I need to refine that estimate and zoom in (as shown). Theory says these functions are, in fact, piecewise linear. What algorithm can I use to refine this estimate with a minimum number of evaluations of the objective function, and with no gradient information?
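For reference, the coarse stage I use is plain golden-section search. A minimal sketch follows; the objective `f` here is a made-up funnel-shaped stand-in (not my real function), and the bracket [0, 100] is the zoomed-out range mentioned above:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search: minimize a unimodal f on [a, b],
    spending one new function evaluation per iteration."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                  # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                        # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Hypothetical stand-in: a non-smooth funnel with small oscillations.
f = lambda x: abs(x - 37.2) + 0.05 * math.sin(40.0 * x)
x_coarse = golden_section(f, 0.0, 100.0, tol=1e-2)
```

Each iteration shrinks the bracket by a factor of about 0.618 at the cost of a single evaluation, which is why the coarse stage is cheap; the trouble is that once the bracket shrinks below the scale of the non-smoothness, the unimodality assumption breaks down.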

[example plot: zoomed-in view of one function near its minimum]


One possible suggestion:

First, use your global optimization to define some small set $I$ (e.g. an interval) in which to work.

Then, use an optimization algorithm that can handle noise without gradient information. Even a simple grid search on $I$ could work. Another idea is an evolutionary algorithm, like differential evolution; such algorithms are easily restricted to $I$. Yet another possibility is a local stochastic search algorithm, like stochastic hill climbing or simulated annealing, possibly with random restarts (ending a run early if it leaves $I$). Memetic algorithms combine both.
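Two of the simplest options above can be sketched in a few lines. This is a minimal sketch, assuming a one-dimensional objective; the function `f` and the endpoints of $I$ are hypothetical stand-ins, not taken from the question:

```python
import random

def grid_search(f, lo, hi, n=1001):
    """Evaluate f on a uniform grid over [lo, hi]; return the best point."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(xs, key=f)

def stochastic_hill_climb(f, x, lo, hi, step=1e-3, iters=2000, seed=0):
    """Local random search restricted to [lo, hi]:
    propose Gaussian perturbations, keep only improvements."""
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(iters):
        cand = min(hi, max(lo, x + rng.gauss(0.0, step)))  # clip to I
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x

# Hypothetical piecewise-linear funnel with many small kinks:
f = lambda x: abs(x - 37.2) + 0.01 * abs((x * 7.3) % 1.0 - 0.5)
lo, hi = 36.0, 38.0                       # interval I from the coarse stage
x = grid_search(f, lo, hi)                # coarse pass over I
x = stochastic_hill_climb(f, x, lo, hi)   # local refinement
```

The grid pass costs a fixed, predictable number of evaluations, while the hill climb spends its budget near the incumbent; if evaluations are expensive, differential evolution with `bounds=[(lo, hi)]` (e.g. SciPy's implementation) is a drop-in alternative for the same interval.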

The fact that the function is piecewise linear could be useful, but if the segment lengths and the relation between neighbouring slopes are truly "random", I'm not sure how to exploit it.