I have a number of functions (two examples are shown below), and I need to find the global optimum of each. They are non-smooth, but they are always funnel-shaped, with a single broad basin around the minimum. If you zoom out (e.g. when the x range is 0-100), the function "looks" smooth, so a search method for unimodal functions (golden section search) finds an approximate position for the global minimum quite easily and quickly. The problems arise when I need to refine that estimate and zoom in (as shown). Theory shows these functions are, in fact, piecewise linear. What algorithm can I use to refine this estimate with a minimum number of evaluations of the objective function, and with no gradient information?
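For concreteness, here is a minimal sketch of the golden-section search I use for the coarse stage (plain pure-Python; it assumes the function is unimodal at that scale):

```python
import math

def golden_section(f, a, b, tol=1e-3):
    """Derivative-free bracketing search; assumes f is unimodal on [a, b]."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:   # minimum lies in [a, d]: shrink from the right
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:         # minimum lies in [c, b]: shrink from the left
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)
```

Each iteration costs one function evaluation and shrinks the bracket by a factor of about 0.618, which is why the coarse stage is cheap; it is only once unimodality breaks down at small scales that this stalls.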
Global optimization of non-smooth function
991 Views · Asked by yannick (https://math.techqa.club/user/yannick/detail) · There are 3 best solutions below
Unfortunately, without bounds on the rate of change, neither continuity nor piecewise linearity will help you. Finding the exact optimum in this case is hopeless.
If you want to find an approximately optimal solution for a given accuracy level, the only algorithm I see that can guarantee this is to slice the domain into steps as small as the desired accuracy and check the function value at every one of them.
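A minimal sketch of that brute-force scan (pure Python; `eps` is the desired accuracy, and without a rate-of-change bound the result is only a heuristic, since the function could dip and recover between samples):

```python
def grid_scan(f, a, b, eps):
    """Evaluate f on a uniform grid of spacing ~eps over [a, b], keep the best."""
    n = max(1, int((b - a) / eps))
    best_x, best_f = a, f(a)
    for i in range(1, n + 1):
        x = a + i * (b - a) / n
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

The cost is exactly `(b - a) / eps` evaluations, which is what the Lipschitz-style bounds discussed next would let you cut down.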
You could make the linear search more efficient if you knew an upper bound on the Lipschitz constant, or a lower bound on the distance between breakpoints (where the slope of the piecewise-linear function changes), as either would let you make jumps in the search space. You already "know" that your function is "funnel shaped", so maybe you can formalize that by bounding the difference between your function and some convex function, and use that bound to prune the search.
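With an upper bound `L` on the Lipschitz constant, this idea is essentially the Piyavskii-Shubert algorithm: maintain the sawtooth lower bound implied by the sampled points and always evaluate where that bound is lowest, stopping once the gap to the best observed value is below tolerance. A sketch (parameter names and defaults are my own):

```python
def piyavskii(f, a, b, L, tol=1e-6, max_evals=100):
    """Global 1-D minimization given an upper bound L on the Lipschitz constant."""
    pts = sorted([(a, f(a)), (b, f(b))])  # sampled (x, f(x)) pairs, sorted by x
    evals = 2
    while evals < max_evals:
        best = min(fx for _, fx in pts)
        cand = None
        # In each gap, the sawtooth bound f(xi) - L|x - xi| is minimized where
        # the two cones from the gap's endpoints intersect.
        for (x0, f0), (x1, f1) in zip(pts, pts[1:]):
            xm = 0.5 * (x0 + x1) + (f0 - f1) / (2.0 * L)
            lb = 0.5 * (f0 + f1) - 0.5 * L * (x1 - x0)
            if cand is None or lb < cand[0]:
                cand = (lb, xm)
        lb, xm = cand
        if best - lb < tol:  # certified: no point can beat best by more than tol
            break
        pts.append((xm, f(xm)))
        pts.sort()
        evals += 1
    return min(pts, key=lambda p: p[1])
```

The payoff over a uniform scan is that regions whose lower bound already exceeds the incumbent are never sampled, and the stopping test is a genuine optimality certificate (up to `tol`), provided `L` really is an upper bound.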
I'd fit a spline and find its minimum. The Stack Overflow question "finding-inflection-points-in-spline-fitted-1d-data" links to nice Python code for this: https://gist.github.com/pv/5504366 .
Added: why a spline fit? Splines can trade off conflicting goals:
- fit the data
- smooth out noise
- find (local) minimum points
Splines fit a smooth curve to data, given either a number $N$ of intermediate points ("knots") or a desired degree of fit. Either way, minima are easily found.
Splines are usually piecewise cubic, but in your case you might try piecewise-linear splines.
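One point worth noting about the piecewise-linear case: the minimum of a piecewise-linear interpolant is always attained at a knot, so "fit and minimize" collapses to an argmin over the sampled values; it is cubic splines (e.g. via `scipy.interpolate`, as in the linked gist) that buy you actual smoothing between samples. A trivial sketch of the piecewise-linear case:

```python
def pl_interpolant_min(xs, ys):
    """Minimum of the piecewise-linear interpolant through (xs, ys).

    On each segment the interpolant is linear, so its minimum over the
    segment sits at one endpoint; globally, it sits at some knot.
    """
    i = min(range(len(ys)), key=ys.__getitem__)
    return xs[i], ys[i]
```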
One possible suggestion:
First, use your global optimization to define some small set $I$ (e.g. interval) in which to work.
Then, use an optimization algorithm that can handle noise without gradient information. Even a simple grid search on $I$ could work. Another idea could be an evolutionary algorithm, like differential evolution. Such algorithms can easily be restricted to $I$. Yet another possibility is a local stochastic search algorithm, like stochastic hill climbing or simulated annealing, possibly with random restarts (ending early if one leaves $I$). Memetic algorithms combine both.
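A sketch of the stochastic-hill-climbing-with-restarts option, clipped to $I = [a, b]$ (the step size, cooling factor, and budgets below are my own untuned choices):

```python
import random

def stochastic_hill_climb(f, a, b, restarts=20, steps=300, seed=0):
    """Random-restart hill climbing restricted to [a, b], derivative-free."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(restarts):
        x = rng.uniform(a, b)      # fresh random start inside I
        fx = f(x)
        step = (b - a) / 10.0
        for _ in range(steps):
            # propose a nearby point, clipped so we never leave I
            cand = min(b, max(a, x + rng.uniform(-step, step)))
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc   # accept improvements only
            else:
                step *= 0.95       # shrink the step on failure to refine locally
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

The shrinking step plays the role of an annealing schedule: early moves explore the basin, late moves polish the estimate, and the restarts hedge against a start landing on a bad local piece.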
The fact that it's piecewise linear could be useful, but if the length of the pieces and relation between neighbouring line slopes is truly "random", I'm not sure how to use it.