Why does NLopt have L-BFGS but not BFGS?


NLopt, the "free/open-source library for nonlinear optimization, providing a common interface for a number of different free optimization routines..." has an L-BFGS routine but, seemingly, no BFGS routine. The closest match I can see in the list of algorithms is the SLSQP routine, which does use BFGS internally. I have two questions:

  1. Why is there no BFGS routine?
  2. Should I just use the L-BFGS routine instead?

Some context: I am using R, whose basic optim function lets you choose BFGS and L-BFGS as algorithms. However, one situation where L-BFGS cannot substitute for BFGS is when the objective function evaluates to NaN at some points of the search space. My problem only involves about ten parameters, so memory is not an issue.


BEST ANSWER

Although the two algorithms are closely related, their implementations are quite different: BFGS constructs and stores a dense approximation of the inverse Hessian explicitly, whereas L-BFGS never forms that matrix. It keeps only the last M pairs of step and gradient-change vectors and reconstructs the search direction on the fly with Jorge Nocedal's two-loop recursion. You can mimic full BFGS by calling nlopt_set_vector_storage with a large value of M (at least the number of iterations you expect to run).
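To make the difference concrete, here is a minimal sketch of the two-loop recursion in pure Python. This is an illustration of the textbook algorithm, not NLopt's actual implementation; the function names and the choice of initial Hessian scaling (the standard gamma = s·y / y·y from the most recent pair) are mine. With M large enough to hold every pair, the recursion uses the full history, which is how a large vector-storage setting approximates full BFGS.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def two_loop_direction(history, g):
    """Nocedal's two-loop recursion.

    history -- list of (s, y) pairs, oldest first, where
               s = x_{k+1} - x_k and y = grad_{k+1} - grad_k
    g       -- current gradient
    Returns an approximation of H^{-1} g (the quasi-Newton direction,
    before negation) without ever forming the n-by-n matrix H^{-1}.
    """
    rhos = [1.0 / dot(s, y) for s, y in history]

    # First loop: newest pair to oldest.
    q = list(g)
    alphas = []
    for (s, y), rho in zip(reversed(history), reversed(rhos)):
        a = rho * dot(s, q)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]

    # Initial Hessian approximation: gamma * I, scaled by the newest pair.
    s_last, y_last = history[-1]
    gamma = dot(s_last, y_last) / dot(y_last, y_last)
    r = [gamma * qi for qi in q]

    # Second loop: oldest pair to newest (alphas were stored newest-first).
    for (s, y), rho, a in zip(history, rhos, reversed(alphas)):
        b = rho * dot(y, r)
        r = [ri + (a - b) * si for ri, si in zip(r, s)]
    return r

# With a single pair satisfying y = 2 s (i.e. Hessian 2I), the recursion
# returns g / 2 exactly, matching the true inverse Hessian.
d = two_loop_direction([([1.0, 0.0], [2.0, 0.0])], [4.0, 6.0])
```

The point of the recursion is that memory grows as O(M n) in the dimension n, instead of the O(n^2) a dense BFGS matrix needs; for your ten-parameter problem either cost is negligible, which is why using L-BFGS with ample storage is a reasonable stand-in.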