If we define, for any positive number $A$:
\begin{align*} g(x) &= \frac{1}{2}x + \frac{A}{2x}, \\ g'(x) &= \frac{1}{2} - \frac{A}{2x^2}, \end{align*}
and define the usual fixed point iteration sequence $\{ p_n \}_{n=0}^\infty$ by $p_n = g(p_{n-1})$ for $n \ge 1$,
how can we prove that this converges to $\sqrt{A}$ when $p_0 > 0$? What happens if $p_0 < 0$?
The standard textbook proof of fixed point convergence requires $|g'(x)| \le k < 1$ on the whole domain, which is clearly not the case here: $g'(x) \to -\infty$ as $x \to 0^+$.
Method 1: Show that if $0<x_0<\sqrt{A}$ then $x_1>\sqrt{A}$ (this is AM–GM applied to $x_0$ and $A/x_0$, with strict inequality since $x_0 \ne \sqrt{A}$). Show that if $x_n>\sqrt{A}$ then $\sqrt{A}<x_{n+1}<x_n$; for the second inequality, note that $x_{n+1}-x_n = \frac{A-x_n^2}{2x_n} < 0$ when $x_n>\sqrt{A}$. Conclude that the sequence, except possibly for the first iterate, is bounded below and monotonically decreasing, hence convergent. Show that the limit of a recursive sequence $x_{n+1}=F(x_n)$ for a continuous function $F$ must be a fixed point of $F$. Apply that to $g$ with the domain $(0,\infty)$, whose only fixed point is $\sqrt{A}$.
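The behavior Method 1 describes is easy to observe numerically. Here is a small Python sketch (the choices $A=7$ and $p_0=0.5$ are arbitrary, just for illustration): the first iterate overshoots $\sqrt{A}$, and the tail then decreases monotonically toward it.

```python
import math

def g(x, A):
    # One step of the fixed point iteration: g(x) = x/2 + A/(2x).
    return 0.5 * x + A / (2 * x)

A = 7.0
p = 0.5                      # start below sqrt(A) ~ 2.6458
seq = [p]
for _ in range(10):
    p = g(p, A)
    seq.append(p)

root = math.sqrt(A)
# The first iterate jumps above sqrt(A)...
assert seq[1] > root
# ...and afterwards the sequence decreases monotonically toward sqrt(A)
# (the "or" clause allows exact equality once machine precision is reached).
assert all(seq[i] > seq[i + 1] > root or abs(seq[i + 1] - root) < 1e-12
           for i in range(1, len(seq) - 1))
print(seq[-1])  # very close to sqrt(7)
```

The convergence is in fact quadratic once the iterates are near $\sqrt{A}$, since this is Newton's method applied to $x^2 - A$.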
Method 2: Notice that $|g'(x)|<1$ if $x>\sqrt{\frac{A}{3}}$. Consider $g$ with the domain $[a,b]$ where $\sqrt{\frac{A}{3}}<a<\sqrt{A}$ and $b>g(a)$. Then $|g'(x)| \leq k<1$ on $[a,b]$. Show that $g$ maps $[a,b]$ into itself and then the Banach fixed point theorem applies.
Method 2 is not as general as Method 1, since it requires starting above $\sqrt{\frac{A}{3}}$, which you actually don't need to. But you can essentially recover Method 1 from Method 2 by noticing that $g((0,\infty))=[\sqrt{A},\infty)$, so after one step the iterate lands in $[\sqrt{A},\infty)$, and then applying the Banach fixed point theorem on $[\sqrt{A},b]$ for any $b \ge g(p_0)$. As for $p_0<0$: $g$ is odd, $g(-x)=-g(x)$, so the iterates starting from $p_0<0$ are the negatives of those starting from $-p_0>0$, and the sequence converges to $-\sqrt{A}$.
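The contraction bound in this last observation can also be checked numerically: on $[\sqrt{A}, b]$ the derivative $g'$ increases from $0$ toward $1/2$, so the Lipschitz constant stays below $\frac12$ no matter how large $b$ is. A sketch (again with the arbitrary illustrative choices $A=7$, $b=100$):

```python
import math

A = 7.0
root = math.sqrt(A)
b = 100.0

def g(x):
    return 0.5 * x + A / (2 * x)

def gprime(x):
    return 0.5 - A / (2 * x * x)

# Sample the interval [sqrt(A), b].
xs = [root + i * (b - root) / 1000 for i in range(1001)]

# On this interval g' is nonnegative and bounded away from 1: 0 <= g'(x) < 1/2.
k = max(abs(gprime(x)) for x in xs)
assert 0 <= k < 0.5

# g maps [sqrt(A), b] into itself: g is increasing there, g(sqrt(A)) = sqrt(A),
# and g(b) < b.  (Small tolerance for floating point rounding at the left end.)
assert all(root - 1e-9 <= g(x) <= b for x in xs)
```

Since $g'$ is monotone on $[\sqrt{A}, b]$, the sampled maximum is attained at $x=b$, i.e. $k = g'(b) = \frac12 - \frac{A}{2b^2}$.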