According to *Numerical Methods for Engineers*, if $x_{i+1} = g(x_i)$ is the function used for iteration, the magnitude of the derivative, $|g'(x)|$, must be less than 1 for the iteration to converge:
It should be clear that fixed-point iteration converges if, in the region of interest, $|g'(x)| < 1$. In other words, convergence occurs if the magnitude of the slope of $g(x)$ is less than the slope of the line $f(x) = x$, which is 1.
But what is the region of interest?
Fixed-point iteration is an open method, so in what region must this condition hold? Couldn't the same $g(x)$ satisfy the condition in one part of the domain and fail it in another?
The region of interest is the region near the fixed point; call it $z$. One can show the following:
If $|g'(z)|<1$, you can choose $x_0$ such that the fixed point iteration converges.
If $|g'(z)|>1$, the fixed-point iteration cannot converge unless, by pure chance, $x_k=z$ for some $k$.
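To see that both cases can occur for a single $g$, here is a small sketch (the function `fixed_point` and the threshold used to detect divergence are my own illustrative choices, not from any textbook): $g(x) = x^2$ has two fixed points, $z = 0$ with $g'(0) = 0 < 1$ (attracting) and $z = 1$ with $g'(1) = 2 > 1$ (repelling).

```python
def fixed_point(g, x0, tol=1e-10, max_iter=100):
    """Iterate x_{k+1} = g(x_k); return (last iterate, converged?)."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, True
        if abs(x_new) > 1e12:  # iterates wandering off: treat as divergence
            return x_new, False
        x = x_new
    return x, False

# Starting near the attracting fixed point z = 0 (|g'(0)| < 1):
print(fixed_point(lambda x: x * x, 0.5))   # → (≈0.0, True)

# Starting near the repelling fixed point z = 1 (|g'(1)| > 1):
print(fixed_point(lambda x: x * x, 1.1))   # → (large value, False)
```

So the same $g$ can satisfy $|g'| < 1$ near one fixed point and violate it near another; convergence is a property of the fixed point you start close to, not of $g$ globally.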
These are local conditions for convergence and divergence. The fixed-point theorem, however, involves an interval, which makes the region of interest explicit: if its conditions hold throughout the interval, convergence occurs for every initial approximation $x_0$ in that interval.
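As a sketch of the interval version (I am assuming the contraction-mapping form of the theorem here: if $g$ maps $[a,b]$ into itself and $|g'(x)| \le k < 1$ on all of $[a,b]$, then the iteration converges for every $x_0 \in [a,b]$): take $g(x) = \cos x$ on $[0, 1]$. Then $\cos$ maps $[0,1]$ into $[\cos 1, 1] \subset [0,1]$ and $|g'(x)| = |\sin x| \le \sin 1 \approx 0.84 < 1$, so every starting point in the interval converges to the same fixed point.

```python
import math

def iterate(g, x0, n):
    """Apply x <- g(x) a fixed number of times and return the result."""
    x = x0
    for _ in range(n):
        x = g(x)
    return x

# Every x0 in [0, 1] converges to the unique fixed point of cos(x),
# x ≈ 0.7390851332 (the solution of cos(x) = x).
for x0 in (0.0, 0.5, 1.0):
    print(x0, iterate(math.cos, x0, 100))
```

This is exactly the difference from the local statements above: the interval hypotheses guarantee convergence for *all* $x_0$ in the interval, not merely for $x_0$ sufficiently close to $z$.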