How do I find the local minima of a function from $\mathbb{R}^3\to\mathbb{R}$?


For a function $f:\mathbb{R}\to\mathbb{R}$, the way to find local minima is to find all $x$ such that $f'(x) = 0$ and $f''(x) > 0$. Notice that the second derivative is not optional; you need it to determine whether a critical point is actually a minimum. EDIT: This test is inconclusive for $f(x) = x^4$, where $f''(0) = 0$ even though $x = 0$ is a local minimum, so $f''(x) > 0$ is sufficient but not necessary.
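The one-dimensional test above can be sketched symbolically; this is a minimal illustration using sympy with a hypothetical example function, not part of the original question:

```python
# Sketch of the 1-D second-derivative test (sympy assumed available).
import sympy as sp

x = sp.symbols('x')
f = x**2 - 4*x  # hypothetical example; critical point at x = 2

critical = sp.solve(sp.diff(f, x), x)      # all x with f'(x) = 0
for c in critical:
    f2 = sp.diff(f, x, 2).subs(x, c)       # evaluate f'' at the critical point
    if f2 > 0:
        print(c, "local minimum")
    elif f2 < 0:
        print(c, "local maximum")
    else:
        print(c, "test inconclusive (e.g. f(x) = x**4 at x = 0)")
```

Running the same loop on `f = x**4` hits the inconclusive branch at $x = 0$, which is exactly the failure mode noted in the EDIT.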

What is the step-by-step procedure for finding all local minima of a function $g : \mathbb{R}^3 \to \mathbb{R}$?

On BEST ANSWER

Find the gradient and solve for where it equals $0$. At each critical point you can compute the Hessian to determine whether it is a local min, local max, or saddle. If you are interested in the global min, I usually just compare the function values at the critical points, which can be quicker in high dimensions...

Simple case: for $f(x,y,z) = x^2 + y^2 + z^2$ the gradient is $(2x, 2y, 2z)'$, and setting it to zero yields $x = y = z = 0$.
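The gradient-then-Hessian procedure for this example can be sketched with sympy; this is an illustrative sketch (function names from sympy, the example function from the answer above):

```python
# Minimal sketch: find and classify critical points of
# f(x, y, z) = x^2 + y^2 + z^2 via gradient and Hessian (sympy assumed).
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 + y**2 + z**2

grad = [sp.diff(f, v) for v in (x, y, z)]        # gradient (2x, 2y, 2z)
critical = sp.solve(grad, (x, y, z), dict=True)  # solve grad f = 0

H = sp.hessian(f, (x, y, z))                     # here a constant 2*I
for pt in critical:
    eigs = list(H.subs(pt).eigenvals())          # eigenvalues of H at pt
    if all(e > 0 for e in eigs):
        print(pt, "local minimum")               # positive definite Hessian
    elif all(e < 0 for e in eigs):
        print(pt, "local maximum")               # negative definite Hessian
    else:
        print(pt, "saddle point or inconclusive")
```

For this $f$ the only critical point is the origin, and the Hessian $2I$ is positive definite there, confirming the local (in fact global) minimum.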

To see how the Hessian distinguishes minima, maxima, and saddle points, this post already has very nice explanations: Why/How does the determinant of the Hessian matrix, combined with the 2nd derivatives, tell us max., min., saddle points? Reasoning behind it?