Proof that directional derivatives must vanish at bistable points.


I am self-studying optimization from the book "Non-convex Optimization for Machine Learning".

I stumbled upon the following statement:

"Directional derivatives must vanish at bistable points."

I don't see how to prove this. Can't we just use Fermat's theorem, which says the gradient must vanish at a local maximum or minimum? And doesn't the way bistable points are defined make them a kind of minimum?

Here is how they are defined:

A point (x_1, y_1) is a bistable point if it satisfies:

f(x_1, y_1) <= f(x, y_1) for all x

and

f(x_1, y_1) <= f(x_1, y) for all y

(that is, the point is optimal along each coordinate when the other coordinate is held fixed).

Any help is appreciated. Thanks...

Answer:

$$\frac{\partial f}{\partial x}(x_1, y_1) = \lim_{\delta \to 0}\frac{f(x_1+\delta, y_1)-f(x_1, y_1)}{\delta}=0,$$

since upon fixing $y = y_1$, the point $x_1$ minimizes the one-variable function $g(x) = f(x, y_1)$. The numerator is therefore non-negative, so the difference quotient is $\ge 0$ for $\delta > 0$ and $\le 0$ for $\delta < 0$; if the partial derivative exists, the two one-sided limits must agree, which forces it to equal $0$. This is exactly Fermat's theorem applied to $g$. The same argument applies in the other direction, for $\frac{\partial f}{\partial y}(x_1, y_1)$.
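As for the intuition that bistable points are "kinda minima": they need not be. Here is a quick numerical sanity check (my own example, not from the book) with $f(x,y) = (y - x^2)(y - 2x^2)$: the origin is bistable, since it globally minimizes $f$ along each coordinate axis, and both partial derivatives vanish there, yet the origin is not a local minimum because $f < 0$ along the curve $y = 1.5\,x^2$.

```python
# f(x, y) = (y - x^2)(y - 2x^2): the origin is a bistable point
# (a global minimum along each coordinate axis) but NOT a local minimum.
def f(x, y):
    return (y - x**2) * (y - 2 * x**2)

h = 1e-5  # finite-difference step

# Along each axis the origin minimizes f, so both partials vanish there.
df_dx = (f(h, 0.0) - f(-h, 0.0)) / (2 * h)
df_dy = (f(0.0, h) - f(0.0, -h)) / (2 * h)
print(df_dx, df_dy)          # both approximately 0

# Yet f is negative on the curve y = 1.5 x^2, so the origin is no minimum.
print(f(0.1, 1.5 * 0.1**2))  # negative
```

So vanishing partial (and hence directional) derivatives are a necessary condition at bistable points, but bistability alone does not make the point a minimum.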