Given a multi-dimensional function $f(x_1, x_2, \ldots, x_n)$ and its gradient of partial derivatives: $$\nabla f = \begin{bmatrix} \frac{\partial f}{\partial x_1}\\ \frac{\partial f}{\partial x_2}\\ \vdots\\ \frac{\partial f}{\partial x_n}\\ \end{bmatrix}$$
Assume an iterative algorithm needs $\nabla f$ and the $\textit{Hessian}$ matrix $H_f$ in each step of an optimization problem. Which would be more efficient (computation-wise) for approximating the $\textit{Hessian}$ matrix:
Using the Sherman–Morrison formula in each step (which costs $\mathcal{O}(n^2)$): https://en.wikipedia.org/wiki/Sherman%E2%80%93Morrison_formula
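For context, this is how quasi-Newton methods keep the per-step cost at $\mathcal{O}(n^2)$: instead of re-inverting the Hessian approximation, they apply a rank-one correction to the stored inverse. A minimal sketch of one such update (the function name `sherman_morrison` and the test matrices are my own, not from any particular library):

```python
import numpy as np

def sherman_morrison(A_inv, u, v):
    """Given A^{-1}, return (A + u v^T)^{-1} in O(n^2) operations.

    Sherman-Morrison:
        (A + u v^T)^{-1} = A^{-1} - (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u)
    """
    Au = A_inv @ u                 # O(n^2) matrix-vector product
    vA = v @ A_inv                 # O(n^2) vector-matrix product
    denom = 1.0 + v @ Au           # scalar; update is valid only if nonzero
    return A_inv - np.outer(Au, vA) / denom   # O(n^2) outer product

# Sanity check against direct O(n^3) inversion on a small example
rng = np.random.default_rng(0)
n = 4
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
u, v = rng.standard_normal(n), rng.standard_normal(n)
updated = sherman_morrison(np.linalg.inv(A), u, v)
direct = np.linalg.inv(A + np.outer(u, v))
print(np.allclose(updated, direct))  # True
```

Every operation in the update is a matrix–vector or outer product, so no $\mathcal{O}(n^3)$ re-inversion is ever needed once an initial inverse is available.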
Using forward differences like $H_f e_1 \approx \frac{\nabla f(x_1 + h, x_2, \ldots, x_n) - \nabla f(x_1, x_2, \ldots, x_n)}{h}$ for each column of the matrix (each perturbed gradient is a vector, so it fills a whole column) - I don't know how many operations this method needs in terms of $\mathcal{O}$
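The second option can be sketched as follows; perturbing one coordinate at a time means $n + 1$ gradient evaluations plus $\mathcal{O}(n^2)$ arithmetic to fill the $n^2$ entries, so the total cost is dominated by how expensive the gradient oracle itself is. The names `fd_hessian` and `grad` are illustrative, not from the question:

```python
import numpy as np

def fd_hessian(grad, x, h=1e-5):
    """Forward-difference Hessian approximation from a gradient oracle.

    Column j is (grad(x + h e_j) - grad(x)) / h, so building the whole
    matrix takes n + 1 gradient evaluations and O(n^2) arithmetic.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    g0 = grad(x)                      # base gradient, reused for every column
    H = np.empty((n, n))
    for j in range(n):
        step = np.zeros(n)
        step[j] = h
        H[:, j] = (grad(x + step) - g0) / h
    return 0.5 * (H + H.T)            # symmetrize: forward differences break symmetry slightly

# Example: f(x) = x0^2 + 3*x0*x1, so grad f = [2*x0 + 3*x1, 3*x0]
grad = lambda x: np.array([2 * x[0] + 3 * x[1], 3 * x[0]])
print(fd_hessian(grad, [1.0, 2.0]))   # ≈ [[2, 3], [3, 0]]
```

Since the gradient in this example is linear, the forward difference recovers the exact Hessian up to rounding; for a general $f$ the error is $\mathcal{O}(h)$ per entry.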