I need to test some unconstrained optimization algorithms on Rosenbrock's banana function so that I can compare the performance of those algorithms.
Now, I have several resources at hand:
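To make the setup concrete, here is a minimal sketch of what one such benchmark run might look like, using plain Python and one of the classic methods (steepest descent with a backtracking line search). The step-size constants and iteration count are my own illustrative choices, not taken from any of the documentation below.

```python
# Minimal sketch: steepest descent with Armijo backtracking on
# Rosenbrock's banana function f(x, y) = (1 - x)^2 + 100 (y - x^2)^2.
# The minimum is at (1, 1); the narrow curved valley is what makes
# this a standard stress test for unconstrained optimizers.

def rosenbrock(x, y):
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def grad(x, y):
    # Analytic gradient of the Rosenbrock function.
    dx = -2 * (1 - x) - 400 * x * (y - x ** 2)
    dy = 200 * (y - x ** 2)
    return dx, dy

def steepest_descent(x, y, iters=20000):
    for _ in range(iters):
        gx, gy = grad(x, y)
        t = 1.0
        # Backtracking: halve the step until the Armijo sufficient-
        # decrease condition holds (c = 1e-4 is a common choice).
        while rosenbrock(x - t * gx, y - t * gy) > \
                rosenbrock(x, y) - 1e-4 * t * (gx * gx + gy * gy):
            t *= 0.5
            if t < 1e-12:
                break
        x, y = x - t * gx, y - t * gy
    return x, y

x, y = steepest_descent(-1.2, 1.0)  # classic starting point
print(x, y, rosenbrock(x, y))
```

Swapping `steepest_descent` for another method while keeping the same function, gradient, and starting point is how I intend to compare the algorithms.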
(a) MATLAB's Optimization Toolbox documentation.
This page describes several techniques for optimizing Rosenbrock's function, namely:
- Optimization Without Derivatives
- Optimization with Estimated Derivatives
- Optimization with Steepest Descent
- Optimization with Analytic Gradient
- Optimization with Analytic Hessian
- Optimization with a Least Squares Solver
- Optimization with a Least Squares Solver and Jacobian
(b) Another documentation page, which lists the names of six different methods:
- Conjugate Gradient
- Levenberg-Marquardt
- Newton
- Quasi-Newton
- Principal Axis
- Interior Point
(c) The Wolfram Mathematica® Tutorial Collection volume *Unconstrained Optimization*.
Its table of contents lists the following names:
Methods for Local Minimization
- Newton's Method
- Quasi-Newton Methods
- Gauss-Newton Methods
- Nonlinear Conjugate Gradient Methods
- Principal Axis Method
Methods for Solving Nonlinear Equations
- Newton’s Method
- The Secant Method
- Brent’s Method
Step Control
- Line Search Methods
- Trust Region Methods
Now I am going crazy reading through all of these names.
Which of these are the names of unconstrained optimization algorithms?