Show that the algorithm to compute square roots converges


Consider calculating the square root as follows:

Say we want to compute the square root of $x>0$. Pick a number $g_1>0$; if $|g_1^2-x| < 0.00001$, we are done. Otherwise, let $g_2=\frac{g_1+\frac{x}{g_1}}{2.0}$ and repeat with $g_2$ in place of $g_1$. Show that the algorithm eventually stops, i.e. that $\lim_{n \to \infty} g_n = \sqrt{x}$.
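The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the question: the function name `sqrt_newton`, the default starting guess `g=1.0`, and the tolerance parameter are my own choices; the stopping test $|g^2-x|<0.00001$ is the one stated in the problem.

```python
def sqrt_newton(x, g=1.0, tol=0.00001):
    """Approximate sqrt(x) by repeatedly replacing the guess g
    with the average of g and x/g, stopping when |g^2 - x| < tol."""
    while abs(g * g - x) >= tol:
        g = (g + x / g) / 2.0
    return g
```

For example, `sqrt_newton(2.0)` returns a value whose square is within the stated tolerance of $2$.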

Best answer

Hint: write $g_1=\sqrt x+e_1$, so $e_1$ is the error of your first approximation. Let $e_2=g_2-\sqrt x$. Show that $e_2$ is smaller than $e_1$ by a factor you can bound away from $1$. Then $e_n$ is smaller than $e_1$ by more than the $(n-1)$-th power of that factor, and you are done.
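To make the hint concrete, here is a sketch of the error recurrence (my own working, assuming $g_1 > 0$ so that all iterates stay positive):

$$e_2 = g_2 - \sqrt{x} = \frac{g_1 + \frac{x}{g_1}}{2} - \sqrt{x} = \frac{g_1^2 - 2g_1\sqrt{x} + x}{2g_1} = \frac{(g_1 - \sqrt{x})^2}{2g_1} = \frac{e_1^2}{2g_1}.$$

In particular $e_2 \ge 0$, so every iterate after the first overshoots, $g_n \ge \sqrt{x}$. Once that holds, $e_n = g_n - \sqrt{x} < g_n$, so the ratio $e_{n+1}/e_n = e_n/(2g_n) < \tfrac12$: the error shrinks by a factor bounded away from $1$, as the hint claims.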