What are the optimization algorithms that deal with functions that have many optima?


I have a function that is not unimodal; it is multimodal. The function does not have a single global optimum as such but has several optima. Is there an optimization algorithm (analytical or numerical) that deals with functions that have several maxima and minima? I believe that having several optimal solutions is different from the existence of local maxima or minima, since that term implies that a global maximum/minimum also exists, whereas my function has several maxima and minima of comparable standing. Please correct me if I am wrong.

The function is complicated, so the standard techniques of differentiation are not of much use.



Best answer:

The choice of the optimization algorithm very much depends on how many parameters your objective function has. There are a number of global solvers that are able to return multiple optima, whether they are at the same "level" or not.

For low-dimensional problems (say, fewer than 10 variables), techniques such as Multilevel Coordinate Search [1], Tunneling [2], and SciPy's global solver SHGO [3] can be very effective at returning multiple optima.
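As a sketch of how SHGO reports multiple optima (the Himmelblau function here is a hypothetical stand-in for your objective; it has four global minima, all with value 0):

```python
import numpy as np
from scipy.optimize import shgo

def himmelblau(x):
    # Classic multimodal test function with four global minima at value 0.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

bounds = [(-6, 6), (-6, 6)]

# shgo returns the best point in res.x, and *all* local minima it found
# in res.xl (with corresponding objective values in res.funl).
res = shgo(himmelblau, bounds, n=256, sampling_method='sobol')

print(res.x)        # one of the four minimizers
print(len(res.xl))  # number of distinct minima found
```

Note that `res.xl` is exactly what you are after: the full set of optima, whether or not they sit at the same objective value.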

For higher-dimensional problems, I believe @dtn's suggestion is your best bet: run a local optimization algorithm from each of many different starting points. Assuming you can get reasonably good coverage with your N random starting points (for example, by using Latin Hypercube sampling or some other space-filling design), the local solver will hopefully return several distinct optima.
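A minimal multistart sketch, assuming a box-constrained 2-D problem (the Himmelblau function is again a hypothetical stand-in for the actual objective):

```python
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

def himmelblau(x):
    # Stand-in multimodal objective with four global minima at value 0.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

lower, upper = np.array([-6.0, -6.0]), np.array([6.0, 6.0])

# Latin Hypercube sampling spreads the starting points evenly over the box.
sampler = qmc.LatinHypercube(d=2, seed=0)
starts = qmc.scale(sampler.random(n=30), lower, upper)

# Run a local solver from each start; keep only distinct optima.
optima = []
for x0 in starts:
    res = minimize(himmelblau, x0, method='L-BFGS-B',
                   bounds=list(zip(lower, upper)))
    if res.success and not any(np.linalg.norm(res.x - x) < 1e-3
                               for x in optima):
        optima.append(res.x)

print(len(optima))  # number of distinct local minima found
```

The deduplication tolerance (`1e-3` here) and the number of starts are problem-dependent knobs; more starts buy higher confidence that no basin was missed.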

[1] https://arnold-neumaier.at/software/mcs/index.html

[2] https://github.com/andyfaff/ampgo/blob/master/%20ampgo%20--username%20andrea.gavana%40gmail.com/go_amp.py

[3] https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.shgo.html

Another answer:

(There are several good options in the other answers, but:)

When all other theory goes out the window and you need an answer for your boss ASAP, try simulated annealing. Be warned: like most methods for this class of (nonlinear/nonconvex) problems, there is not much theory backing it up. That said, it often works "well enough" in practice.
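A quick way to try this is SciPy's `dual_annealing`, a simulated-annealing-style global optimizer (the Rastrigin function below is a hypothetical stand-in for your objective; it has many local minima and a global minimum of 0 at the origin):

```python
import numpy as np
from scipy.optimize import dual_annealing

def rastrigin(x):
    # Highly multimodal test function; global minimum is 0 at x = 0.
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 4  # 4-D box constraints

res = dual_annealing(rastrigin, bounds, seed=1)
print(res.x, res.fun)  # best point found and its objective value
```

Like plain simulated annealing, it only returns the single best point it found, so to collect several optima you would still combine it with restarts or with one of the multi-optimum solvers above.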

Another approach would be branch and bound.