How do MATLAB's optimization tools work?
Do they just take the objective (error) function, without requiring the Jacobian (first derivatives) or the Hessian (second derivatives)?
How is that possible?
Given the objective function, and assuming it is differentiable, one can use numerical methods to approximate its gradient and Hessian at a given point.
One such method is the Finite Differences method.
The problem with this kind of method is that in high-dimensional problems it requires many evaluations of the objective function around the given point (for the gradient alone, at least $ n $ extra evaluations for a function in dimension $ n $).
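As a sketch of the idea (in Python with NumPy rather than MATLAB, and not MATLAB's actual implementation), a forward-difference gradient needs one extra function evaluation per dimension:

```python
import numpy as np

def finite_difference_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x by forward differences.

    Costs n + 1 evaluations of f for an n-dimensional x,
    which is why derivative-free use of solvers gets expensive
    as the dimension grows.
    """
    n = x.size
    f0 = f(x)                      # base evaluation
    g = np.empty(n)
    for i in range(n):
        xp = x.copy()
        xp[i] += h                 # perturb one coordinate
        g[i] = (f(xp) - f0) / h    # one extra evaluation per coordinate
    return g
```

For example, for $ f(x) = x_1^2 + 3 x_2 $ at $ (1, 2) $ this returns approximately $ (2, 3) $, the analytic gradient.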
There are methods that ease this computation, such as Quasi-Newton methods, which build an approximation of the (inverse) Hessian from successive gradients instead of computing it directly; yet the cost is still not negligible.
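To illustrate the Quasi-Newton idea, here is the standard BFGS update of the inverse-Hessian approximation (a generic sketch in Python/NumPy, not MATLAB's internal code). It uses only the step $ s $ and the change in gradient $ y $, so no second derivatives are ever evaluated; the update is constructed so that the secant equation $ H_{new} y = s $ holds exactly:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_new - x_old          (the step taken)
    y = grad_new - grad_old    (the observed change in gradient)

    Only gradient information is used; the true Hessian is never formed.
    """
    rho = 1.0 / (y @ s)            # requires the curvature condition y's > 0
    I = np.eye(H.shape[0])
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)
```

One can check that the returned matrix is symmetric and maps $ y $ exactly to $ s $, which is what lets the method mimic Newton steps using gradients alone.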
Hence, if you have analytic expressions for the gradient and the Hessian, it is better to supply them to MATLAB (for instance, via the SpecifyObjectiveGradient and HessianFcn options of fminunc).