What's the relationship between automatic differentiation and gradient methods?


I'm learning about shape optimization, and in numerical methods for shape optimization I've come across the terms "automatic differentiation" and "gradient method".

A Google search gives the impression that the two are related. But are they not the same thing? How exactly are they related?


Best answer

Gradient methods (such as the gradient descent method) for minimizing a differentiable function $f:\mathbb R^n \to \mathbb R$ require evaluating the gradient of $f$ at each iteration. Unfortunately, sometimes there is no nice formula for the gradient of $f$. So what do you do?
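To make the iteration concrete, here is a minimal gradient descent sketch. The function $f(x,y) = x^2 + 10y^2$ and all step-size and iteration choices are hypothetical, chosen only for illustration; here the gradient happens to be known in closed form.

```python
# Minimal gradient descent on the hypothetical test function
# f(x, y) = x^2 + 10*y^2, whose gradient is known in closed form.

def grad_f(x, y):
    # Analytic gradient of f: (2x, 20y)
    return 2.0 * x, 20.0 * y

def gradient_descent(x, y, step=0.05, iters=200):
    # Each iteration needs the gradient at the current point.
    for _ in range(iters):
        gx, gy = grad_f(x, y)
        x -= step * gx
        y -= step * gy
    return x, y

x, y = gradient_descent(5.0, 3.0)
# Both coordinates approach the minimizer (0, 0).
```

The point of the sketch is the loop structure: every pass requires a gradient evaluation, which is exactly where the question of *how* to obtain the gradient arises.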

One option is to estimate the gradient of $f$ numerically using finite difference formulas. This approach only requires that you are able to evaluate $f$ at a given point $x$.
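A central-difference version of this idea can be sketched as follows; the quadratic test function is again a hypothetical stand-in for a function with no known gradient formula.

```python
# Central finite-difference estimate of the gradient, using only
# evaluations of f (here a hypothetical quadratic for illustration).

def f(v):
    x, y = v
    return x**2 + 10.0 * y**2

def fd_gradient(f, v, h=1e-6):
    # Perturb one coordinate at a time: df/dx_i ~ (f(v + h e_i) - f(v - h e_i)) / (2h)
    grad = []
    for i in range(len(v)):
        vp = list(v); vp[i] += h
        vm = list(v); vm[i] -= h
        grad.append((f(vp) - f(vm)) / (2.0 * h))
    return grad

g = fd_gradient(f, [1.0, 2.0])
# Close to the exact gradient (2, 40), up to truncation and roundoff error.
```

Note the cost: one gradient estimate takes $2n$ evaluations of $f$ in $\mathbb R^n$, and the result is only an approximation.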

Another approach is to use "automatic differentiation" software. If you have code that is able to evaluate $f$, then automatic differentiation software might be able to read your code and generate new code that is able to compute the gradient of $f$ exactly (except for roundoff error). (This is achieved by, in some sense, systematically applying the chain rule.)
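A toy way to see the "systematic chain rule" idea is forward-mode automatic differentiation with dual numbers: every value carries its derivative along, and each arithmetic operation updates both. This is a minimal sketch, not how production AD tools are built, and it supports only the operations defined below.

```python
# Toy forward-mode automatic differentiation via dual numbers:
# each value carries a derivative, and +, * apply the chain rule.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x, y):
    # Any code built from the supported operations gets differentiated.
    return x * x + 10.0 * y * y

def grad_f(x, y):
    # Forward mode: seed one input at a time with derivative 1.
    dx = f(Dual(x, 1.0), Dual(y, 0.0)).dot
    dy = f(Dual(x, 0.0), Dual(y, 1.0)).dot
    return dx, dy

gx, gy = grad_f(1.0, 2.0)  # exact gradient: (2, 40)
```

The derivatives come out exact (no truncation error), because the chain rule is applied symbolically operation by operation, even though the code only ever manipulates numbers.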

In many cases this is better than estimating the gradient using finite differences, because you are now computing the gradient exactly (to machine precision).
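The accuracy gap can be illustrated numerically. In this sketch (the function and step sizes are arbitrary choices), a central difference of $f(x) = x^3$ at $x = 1$ has truncation error that shrinks with the step $h$, but roundoff error that takes over once $h$ is very small, so there is a floor on the achievable accuracy that exact (AD) gradients do not have.

```python
# Finite-difference error vs. step size for f(x) = x**3 at x = 1,
# where the exact derivative is 3.

def f(x):
    return x ** 3

exact = 3.0
errors = {}
for h in (1e-2, 1e-5, 1e-12):
    fd = (f(1.0 + h) - f(1.0 - h)) / (2.0 * h)
    errors[h] = abs(fd - exact)
    print(f"h = {h:g}: error = {errors[h]:.2e}")
# Moderate h: error dominated by O(h^2) truncation.
# Tiny h: subtracting nearly equal f values amplifies roundoff.
```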