Is there an optimization method with these characteristics?


I have been looking at tons of papers recently; however, I am not an expert in the topic. I am looking for a method/algorithm that has all (or most) of these properties:

  • derivative-free (not gradient based)
  • parallelisable
  • can cope with multiple local extrema and a discontinuous objective function

If possible it should also be simple to implement and asynchronous, but these are extras. Is there any method having all three of the properties above? I looked at GPS, GSS, golden-section search, Fibonacci search, Hooke & Jeeves, and Nelder–Mead, but I found nothing.

BEST ANSWER

Suppose you have a universe of solutions, there is an energy function $E$ on the set of solutions, and each solution $x$ has a neighborhood $N(x)$ consisting of all solutions obtained from $x$ by small perturbations.

Start from a high temperature $t=T_\max$ and pick a solution $x$ at random.

In each step, decrease the temperature, say by 1%, i.e., $t\leftarrow 0.99\,t$. Pick a solution $y\in N(x)$ at random. If $E(y)>E(x)$, accept the move by setting $x\leftarrow y$. (So far, this looks like hill climbing or steepest ascent.) Otherwise, draw a random number $r\in[0,1]$. If $r <\exp((E(y)-E(x))/t)$, still set $x\leftarrow y$. (This allows the search to escape from local optima.) Otherwise, do not change $x$. Now resume.

Exit if a given number of iterations is reached or the temperature is low enough.
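The scheme above (this is simulated annealing) can be sketched as follows; the function and parameter names here are my own choices, not part of the original description:

```python
import math
import random

def simulated_annealing(E, x0, neighbor, t_max=100.0, t_min=1e-3,
                        cooling=0.99, max_iters=10_000, seed=0):
    """Maximize E by simulated annealing.

    E        -- energy function to maximize
    x0       -- initial solution, drawn at random by the caller
    neighbor -- returns a random small perturbation of x (a draw from N(x))
    """
    rng = random.Random(seed)
    x, t, best = x0, t_max, x0
    for _ in range(max_iters):
        if t < t_min:          # exit when the temperature is low enough
            break
        y = neighbor(x, rng)
        # Accept uphill moves always; accept downhill moves with
        # probability exp((E(y) - E(x)) / t), the Boltzmann factor.
        if E(y) > E(x) or rng.random() < math.exp((E(y) - E(x)) / t):
            x = y
        if E(x) > E(best):     # keep the best solution seen so far
            best = x
        t *= cooling           # geometric cooling: 1% per step
    return best
```

For example, to maximize $E(x) = -(x-3)^2$ over the reals, one could use `neighbor = lambda x, r: x + r.uniform(-0.5, 0.5)` and an arbitrary starting point such as `x0 = 0.0`; the returned solution should land close to the maximizer $x=3$.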

When the temperature is high, it is easy to escape from local optima, since the Boltzmann factor $\exp((E(y)-E(x))/t)$ is close to 1 even for downhill moves. When the temperature is low, the Boltzmann factor for a downhill move is close to 0, and so the probability test will hardly ever succeed.
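To see this numerically, take a fixed downhill move with $E(y)-E(x)=-5$ and evaluate the Boltzmann factor at a few temperatures (the values here are illustrative, not from the original answer):

```python
import math

dE = -5.0  # a downhill move: E(y) - E(x) = -5
for t in (100.0, 1.0, 0.01):
    # At t=100 the factor is about 0.95 (almost always accepted);
    # at t=0.01 it is astronomically small (essentially never accepted).
    print(f"t = {t:6.2f}  acceptance probability = {math.exp(dE / t):.3g}")
```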

Theoretically, under mild assumptions (a sufficiently slow cooling schedule), the algorithm is guaranteed to reach the global optimum. I hope it helps.