Nelder Mead search vs conjugate gradient descent


I was reading about these two algorithms and was wondering: how does the choice of initial guess affect each of them, and which would be the better choice if we are starting from a random guess?

Best answer:

One of the huge advantages of Nelder Mead is that you don't have to explicitly evaluate your objective function in order to find your next iterate -- you only need a relative ordering of which "experiment" was better than the others. This makes the algorithm much more amenable to things like experimental design, where it is hard to assign an actual objective value, but scientists can compare the outputs of experiments with many variables and heuristically decide which experiment was "better" than another.
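To make the "ordering only" point concrete, here is a minimal pure-Python sketch of a Nelder Mead iteration (standard reflection, expansion, contraction, and shrink coefficients). Note that the objective `f` is only ever used to *compare* points; it is never differentiated:

```python
def nelder_mead(f, simplex, max_iter=500):
    """Minimize f over a list of n+1 starting points (the simplex).
    f is used purely through value comparisons -- no gradients."""
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5
    for _ in range(max_iter):
        simplex.sort(key=f)                      # ordering is all we need
        best, worst = simplex[0], simplex[-1]
        n = len(best)
        centroid = [sum(p[i] for p in simplex[:-1]) / (len(simplex) - 1)
                    for i in range(n)]
        refl = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
        if f(best) <= f(refl) < f(simplex[-2]):
            simplex[-1] = refl                   # accept reflection
        elif f(refl) < f(best):
            exp = [c + gamma * (r - c) for c, r in zip(centroid, refl)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        else:
            contr = [c + rho * (w - c) for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                simplex[-1] = contr              # accept contraction
            else:                                # shrink toward the best point
                simplex = [best] + [
                    [b + sigma * (x - b) for b, x in zip(best, p)]
                    for p in simplex[1:]]
    return min(simplex, key=f)
```

In a real experimental-design setting you would replace the `sort(key=f)` and the `f(...) < f(...)` comparisons with whatever comparison oracle the scientists provide.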

Conjugate gradient only works when you have a well-defined objective function, that function is smooth, and you can actually compute its gradient. In exchange for this extra cost, there are proofs that the algorithm converges at a fast rate on a wide class of problems. On the other hand, since Nelder Mead is based on a heuristic, to my knowledge there are few results guaranteeing that it will converge to a meaningful answer, or that it will converge as quickly as CG.
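To see how much structure CG exploits, here is a sketch of linear CG minimizing the quadratic f(x) = ½xᵀAx − bᵀx for a symmetric positive definite A (equivalently, solving Ax = b). The gradient Ax − b is computed at every step, and in exact arithmetic the exact minimizer is reached in at most n iterations for an n×n system:

```python
def conjugate_gradient(A, b, x0, tol=1e-10):
    """Linear CG: minimizes 0.5 x^T A x - b^T x, i.e. solves A x = b.
    Requires the gradient (residual) r = b - A x at every iteration."""
    n = len(b)
    mv = lambda M, v: [sum(M[i][j] * v[j] for j in range(n))
                       for i in range(n)]             # matrix-vector product
    x = list(x0)
    r = [bi - axi for bi, axi in zip(b, mv(A, x))]    # residual = -gradient
    p = list(r)                                       # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(n):
        Ap = mv(A, p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x
```

The n-step exact termination is specific to this linear/quadratic case; nonlinear CG variants (Fletcher-Reeves, Polak-Ribiere) keep the same gradient requirement but only converge in the limit.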

As far as initialization goes, I think there are a lot of heuristics out there (e.g. you don't want to start Nelder Mead with your initial experiments too close to each other), but I don't think there are any guarantees saying one method is always better than the other. If function evaluations are costly, note that CG only requires one initial iterate. I'm sure there are other heuristics for initializing CG, but I'm not too familiar with them. My numerical analysis prof always suggested initializing at zero or at a random vector on the unit sphere.
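A sketch of both initialization heuristics mentioned above: a well-separated starting simplex for Nelder Mead, and a random point on the unit sphere for CG. The step size and the Gaussian-then-normalize construction here are my own illustrative choices, not the only reasonable ones:

```python
import math
import random

def random_unit_vector(n, rng=None):
    # One common CG initialization: sample a Gaussian vector and
    # normalize it, giving a uniformly random point on the unit sphere.
    rng = rng or random.Random()
    v = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(vi * vi for vi in v))
    return [vi / norm for vi in v]

def initial_simplex(x0, step=0.5):
    # Nelder Mead needs n+1 starting points; keep them well separated
    # by perturbing each coordinate of x0 by a nontrivial step.
    simplex = [list(x0)]
    for i in range(len(x0)):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    return simplex
```

A `step` that is too small makes the simplex nearly degenerate and the first few comparisons uninformative, which is exactly the "too close to each other" failure mode.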