Let's assume I have an initial value problem of the form $dx/dt = -\nabla f(x, t)$, for $t \in [0, T]$, where $x \in \mathbb{R}^d$, $f : \mathbb{R}^d \times [0, T] \to \mathbb{R}$, and $\nabla$ denotes the gradient with respect to $x$.
Are there any numerical solvers that use higher-order derivatives of $f$, such as the Hessian matrix (or a diagonal approximation of it), to speed up convergence? In my setup, evaluating $f$ is costly, but each evaluation of $f$ yields derivatives such as the gradient and Hessian essentially for free.
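To make the question concrete, here is a minimal sketch (not part of my actual code) of one family of methods that seems relevant: a linearly implicit (Rosenbrock–Euler) step, which uses the Hessian of $f$ as the Jacobian of the right-hand side $-\nabla f$. Each step solves $(I + h H(x_n))\,\Delta x = -h \nabla f(x_n)$, so the Hessian I get for free would be used directly; the quadratic $f$ below is just a toy test problem.

```python
import numpy as np

def rosenbrock_euler_step(x, h, grad, hess):
    """One linearly implicit step for dx/dt = -grad f(x):
    solve (I + h*H(x)) dx = -h*g(x), then return x + dx."""
    g = grad(x)
    H = hess(x)
    dx = np.linalg.solve(np.eye(len(x)) + h * H, -h * g)
    return x + dx

# Toy problem: f(x) = 0.5 * x^T A x, so grad f = A x and Hess f = A.
A = np.diag([1.0, 10.0])          # mildly stiff: curvatures differ by 10x
grad = lambda x: A @ x
hess = lambda x: A

x = np.array([1.0, 1.0])
h = 0.5                            # larger than explicit Euler's stability
for _ in range(40):                # limit h < 2/10 = 0.2 for this A
    x = rosenbrock_euler_step(x, h, grad, hess)
# x decays toward the minimizer at the origin without instability
```

(For the diagonal-Hessian variant I mention, the linear solve would reduce to an elementwise division.)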