For a general non-linear optimization problem of the form:
$$ \min_x f(x) $$
There exist a number of iterative methods that avoid forming the exact Hessian $\nabla^2 f$: examples are steepest descent, conjugate gradient, Levenberg-Marquardt, and the quasi-Newton family, of which BFGS is the best-known member. BFGS, which builds up a Hessian approximation from successive gradient differences, is often considered the most accurate of these approximations.
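As a minimal illustration of the general problem, here is BFGS applied via `scipy.optimize.minimize` to the Rosenbrock function (the test function and starting point are my own choices, not from any particular reference):

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    # classic Rosenbrock test function, minimum at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosen_grad(x):
    # analytic gradient of the Rosenbrock function
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

# BFGS never forms the true Hessian; it maintains an inverse-Hessian
# estimate, which scipy reports as res.hess_inv after convergence.
res = minimize(rosen, x0=np.array([-1.2, 1.0]), jac=rosen_grad, method="BFGS")
```

At the solution, `res.hess_inv` holds the accumulated inverse-Hessian approximation, which is what distinguishes BFGS from steepest descent (no curvature model at all) and conjugate gradient (curvature used only implicitly through line searches).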
A slightly different problem is solved by MINPACK's `lmder`, the routine behind Python's `scipy.optimize.leastsq`:
$$ \min_x \| f(x) - d \|^2 $$
where $d$ is the data to fit. The corresponding Gauss-Newton update is: $$ r = f(x) - d \\ J = \nabla f(x) \\ H \approx J^T J \\ \Delta x = -H^{-1} J^T r $$
The extension to Levenberg-Marquardt replaces this with $H \approx J^T J + \delta I$. Both are still approximations to the true Hessian, which for this problem is $J^T J + \sum_i r_i \nabla^2 f_i$; the second-derivative term is simply dropped.
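To make the update concrete, here is a sketch of a single damped step for a toy one-parameter exponential fit (the model, data, and damping value are illustrative assumptions, not anything from MINPACK itself; $\delta = 0$ recovers the plain Gauss-Newton step):

```python
import numpy as np

def f(x, t):
    # toy model: one-parameter exponential evaluated at sample points t
    return np.exp(x[0] * t)

def jacobian(x, t):
    # analytic Jacobian df/dx, shape (len(t), len(x))
    return (t * np.exp(x[0] * t)).reshape(-1, 1)

def lm_step(x, t, d, delta=0.0):
    # one Levenberg-Marquardt step for min_x ||f(x) - d||^2;
    # delta = 0 gives the undamped Gauss-Newton step
    r = f(x, t) - d                            # residual
    J = jacobian(x, t)                         # Jacobian at x
    H = J.T @ J + delta * np.eye(J.shape[1])   # damped Hessian approximation
    dx = -np.linalg.solve(H, J.T @ r)          # solve H dx = -J^T r
    return x + dx

t = np.linspace(0.0, 1.0, 20)
d = np.exp(0.7 * t)            # synthetic data, true parameter 0.7
x = np.array([0.5])
for _ in range(5):
    x = lm_step(x, t, d, delta=1e-3)
```

Since the residual vanishes at the solution, the damping term only slows convergence rather than biasing the fixed point, and `x` converges to the true parameter.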
Does this form of the problem also admit other approximations to the Hessian, such as BFGS?