Stagnation of a function and gradient descent


I'm learning simple gradient descent optimization. I feel I understand the concept, but there is one sentence I don't get.

$\Delta x_i=-\beta \frac{\partial f}{\partial x_i}$ (with $\beta > 0$) is the direction of steepest descent, where $f$ is the function being minimized.

Then it is stated

Stagnation of $f$: the directions $\Delta x$ satisfying $\langle \frac{df}{dx}, \Delta x \rangle = 0$ (i.e., the $(n-1)$-dimensional subspace orthogonal to the gradient).
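Here is a small numerical sketch I tried in order to test my reading of that sentence (the quadratic $f(x,y)=x^2+3y^2$ and the point are my own arbitrary choices, not from the text): a step along the gradient changes $f$ to first order in the step size, while a step orthogonal to the gradient changes $f$ only to second order, so to first order $f$ "stagnates" in those directions.

```python
import math

def f(x, y):
    # arbitrary smooth function, chosen only for illustration
    return x**2 + 3*y**2

def grad_f(x, y):
    # analytic gradient of f
    return (2*x, 6*y)

p = (1.0, 1.0)
g = grad_f(*p)                       # gradient at p: (2, 6)
ng = math.hypot(*g)                  # |grad f| = sqrt(40)

# a unit direction orthogonal to the gradient: <grad f, d> = 0
d = (g[1] / ng, -g[0] / ng)

eps = 1e-4
f0 = f(*p)

# change along the (normalized) gradient: ~ eps * |grad f|, first order
along = f(p[0] + eps * g[0] / ng, p[1] + eps * g[1] / ng) - f0

# change along the orthogonal direction: only ~ eps^2, second order
ortho = f(p[0] + eps * d[0], p[1] + eps * d[1]) - f0

print(along)  # on the order of eps * |grad f| ~ 6e-4
print(ortho)  # on the order of eps^2 ~ 1e-8
```

So the orthogonal step leaves $f$ essentially unchanged at this scale, which is (I think) what "stagnation" refers to.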

I don't understand how I should interpret and visualize this stagnation.

Could someone please explain the above?

Thank you very much.