When applying the globalized BFGS algorithm (a quasi-Newton method) to approximate the minimizer of a function, the computed search direction can sometimes fail to be a descent direction, e.g. when the Hessian approximation loses positive definiteness.
What are possible methods (or algorithms) to compute the approximation of the Hessian matrix that has to be updated for the next iteration, so that this problem is avoided?
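For context, here is a minimal sketch (in Python with NumPy; all names are illustrative, not from any particular library) of the standard BFGS update of the Hessian approximation $B_k$, together with the common safeguard of skipping the update whenever the curvature condition $s_k^\top y_k > 0$ fails, which keeps $B_k$ positive definite:

```python
import numpy as np

def bfgs_update(B, s, y, curvature_tol=1e-10):
    """One BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k).
    The update is skipped when the curvature condition s^T y > 0
    fails (a common safeguard that preserves positive definiteness).
    """
    sy = s @ y
    if sy <= curvature_tol * np.linalg.norm(s) * np.linalg.norm(y):
        return B  # skip the update to keep B positive definite
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

# Toy example: minimize f(x) = 0.5 x^T A x with a simple BFGS loop
# (fixed damped step instead of a line search, for illustration only)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
x = np.array([1.0, 1.0])
B = np.eye(2)                          # initial Hessian approximation
for _ in range(20):
    d = -np.linalg.solve(B, grad(x))   # quasi-Newton search direction
    s = 0.5 * d                        # step taken this iteration
    y = grad(x + s) - grad(x)
    B = bfgs_update(B, s, y)
    x = x + s
```

Since $B_k$ remains positive definite under this safeguard, the direction $d_k = -B_k^{-1}\nabla f(x_k)$ is always a descent direction.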
Thanks in advance!