I have a doubt. I read somewhere that gradient descent will diverge if the chosen step size is too large. But gradient descent, say using exact line search, chooses a step size only if it moves downhill, i.e.,
$$ f \left( x_{k+1} \right) < f \left( x_k \right) $$
What I read that led to this doubt is the following slide:

Ideally, it chooses the direction of the negative gradient (which itself points towards the inner contours), and the step size is chosen (using exact line search) up to the point where $f$ keeps decreasing. So, in the best case, which is reached for isotropic (circular) contours, it should land straight at the minimum.
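To check this intuition, here is a minimal sketch (my own example, assuming the simple isotropic quadratic $f(x) = \|x\|^2$, which has circular contours) where exact line search indeed lands on the minimum in a single step, because the 1-D step-size problem can be solved in closed form:

```python
import numpy as np

def f(x):
    # isotropic quadratic: circular contours, minimum at the origin
    return float(x @ x)

def grad(x):
    return 2 * x

def exact_line_search_step(x):
    """One gradient step with the exactly minimizing step size.
    phi(t) = ||x - t g||^2 is quadratic in t, so its minimizer is
    t = (x . g) / (g . g) in closed form."""
    g = grad(x)
    t = (x @ g) / (g @ g)
    return x - t * g

x0 = np.array([3.0, -4.0])
x1 = exact_line_search_step(x0)
print(x1)  # lands exactly at the minimizer [0, 0]
```

For anisotropic (elliptical) contours the same procedure still decreases $f$ at every step, but it no longer reaches the minimum in one step; it zigzags instead.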
The gradient descent method means that at each iteration you move from the current point to the next in the direction opposite to the gradient. Each iteration is a function of two parameters, the direction and the step size $\alpha_k$:
$$ x_{k+1} = x_k - \alpha_k \nabla f \left( x_k \right) $$
The gradient descent direction only promises that there is a small ball within which the value of the function decreases (unless you're at a stationary point).
Yet the size (radius) of this ball isn't known.
There are many algorithms for finding a valid step size.
One of them (probably the hardest) is exact line search.
In practice, a better choice would be backtracking.
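As a rough sketch of what backtracking means (this is the standard Armijo sufficient-decrease variant; the constants `alpha0`, `beta`, `c` are conventional defaults, not from the answer): start from a large trial step and shrink it until the function has decreased by enough.

```python
import numpy as np

def backtracking(f, grad_f, x, alpha0=1.0, beta=0.5, c=1e-4):
    """Armijo backtracking: shrink the step until the sufficient-decrease
    condition f(x - a g) <= f(x) - c * a * ||g||^2 holds."""
    g = grad_f(x)
    alpha = alpha0
    while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
        alpha *= beta  # step was too large; shrink it
    return alpha

# demo on a simple quadratic
f = lambda x: float(x @ x)
grad_f = lambda x: 2 * x

x = np.array([3.0, -4.0])
for _ in range(20):
    alpha = backtracking(f, grad_f, x)
    x = x - alpha * grad_f(x)
print(f(x))  # close to 0
```

The point is that backtracking only ever needs function evaluations, whereas exact line search requires solving a 1-D minimization to optimality at every iteration.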
With a large step size you may land outside the "ball" where the function is decreasing, and actually end up at a worse point.
Iterating this can cause the method to diverge.
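This divergence is easy to demonstrate on the 1-D quadratic $f(x) = x^2$ (a hypothetical example of mine, not from the answer): with a fixed step $\alpha$, the update is $x_{k+1} = (1 - 2\alpha)\, x_k$, so the iterates shrink when $|1 - 2\alpha| < 1$ and blow up otherwise. No such blow-up can happen under exact line search, since it never accepts a step that increases $f$.

```python
def f(x):
    return x * x

def grad(x):
    return 2 * x

def run_gd(x0, alpha, steps=10):
    """Gradient descent with a FIXED step size (no line search)."""
    x = x0
    for _ in range(steps):
        x = x - alpha * grad(x)  # x_{k+1} = (1 - 2*alpha) * x_k
    return x

print(run_gd(1.0, 0.1))  # |1 - 2*alpha| = 0.8 < 1: converges toward 0
print(run_gd(1.0, 1.5))  # |1 - 2*alpha| = 2 > 1: iterates double, diverge
```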