Finding an optimization test function that is differentiable but whose gradient and Hessian are very hard or impossible to compute


I am looking for a differentiable test function for evaluating gradient-based optimization methods. However, the function's derivatives (both gradient and Hessian) should be very hard, if not impossible, to compute analytically. Any examples?

(As a counterexample: the multivariate Rosenbrock function is too easy, even though working out its derivatives by hand still takes some effort.)
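To make "easy to differentiate" concrete, here is a sketch (in Python with NumPy; function names are mine) of the Rosenbrock gradient mentioned above, verified against central finite differences. A candidate answer to this question would be a function for which the analogue of `rosenbrock_grad` is impractical to write down, while a numerical check like `fd_grad` still works:

```python
import numpy as np

def rosenbrock(x):
    """Multivariate Rosenbrock: sum of 100*(x[i+1]-x[i]^2)^2 + (1-x[i])^2."""
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def rosenbrock_grad(x):
    """Analytical gradient, derived by hand term-by-term."""
    g = np.zeros_like(x)
    # Terms where x[i] appears as the "inner" variable of pair i:
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    # Terms where x[i] appears as the "outer" variable of pair i-1:
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g

def fd_grad(f, x, h=1e-6):
    """Central finite-difference gradient, usable when no closed form exists."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g
```

The two gradients agree to roughly the finite-difference truncation error, which is the usual way to sanity-check a hand-derived gradient before plugging it into an optimizer.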