Why is the average taken this way, when calculating square root of a number?

I was watching a video lecture from the series Introduction to Computer Science and Programming Using Python (see here). It presents an algorithm for computing the square root of a number $x$: start with an initial guess $g$, replace it with the average $$\frac{g+\tfrac xg}{2},$$ and repeat. Why did they not take a simple average like $\frac{g+x}{2}$? Here is a screenshot:

Screenshot

Also, any resources that could help me improve my mathematical understanding for tackling this kind of problem would be really helpful.

2 Answers

Because repeatedly taking the average $\frac{g+\tfrac{x}{g}}{2}$ converges to the square root, whereas repeatedly taking the average $\frac{g+x}{2}$ does not. Note that $\sqrt{x}$ is a fixed point of the first map (if $g=\sqrt{x}$, then $\frac{g+x/g}{2}=\sqrt{x}$), while the only fixed point of $g\mapsto\frac{g+x}{2}$ is $g=x$ itself, so the naive iteration converges to $x$, not $\sqrt{x}$.
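A minimal sketch (not the lecture's exact code) comparing the two update rules, using $x = 25$ and the same starting guess for both:

```python
x = 25.0

# Update rule from the lecture: g <- (g + x/g) / 2
g = 3.0
for _ in range(10):
    g = (g + x / g) / 2  # converges to sqrt(x) = 5
print(g)

# Naive update rule: h <- (h + x) / 2
h = 3.0
for _ in range(10):
    h = (h + x) / 2      # converges to x = 25, not sqrt(x)
print(h)
```

After ten iterations the first loop is at $5$ to machine precision, while the second is near $25$: each naive step just halves the distance to $x$, so it can never settle anywhere other than $x$.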

Without any sophisticated theory, the point is that if $g>0$ and $g \neq \sqrt{x}$ then $g$ and $x/g$ are on opposite sides of $\sqrt{x}$, since they multiply together to give $x$. Therefore if the distance between $g$ and $\sqrt{x}$ is about the same as the distance between $x/g$ and $\sqrt{x}$, then the average of the two of them is going to be closer to $\sqrt{x}$ than either of them is.

You can check that $\frac{x}{g}-\sqrt{x}=-\frac{\sqrt{x}}{g}(g-\sqrt{x})$, so the condition I just mentioned basically amounts to "$\frac{\sqrt{x}}{g}$ is reasonably close to $1$". When this isn't true, things can get worse for one iteration and can then take a while to recover. For example, if you want $\sqrt{1}$ and you guess $10^{-12}$ then it's going to take about 40 iterations just to shrink your guess down to the right order of magnitude after it skyrockets to about $\frac{10^{12}}{2}$ in the first step.
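A quick sketch of that pathological case (the iteration count here is an observation from running it, not a claim from the lecture): with $x=1$ and $g=10^{-12}$, the first step jumps to about $\tfrac{10^{12}}{2}$, and while $g$ is huge the term $x/g$ is negligible, so each step roughly halves $g$ until it gets back to order $1$.

```python
x = 1.0
g = 1e-12  # deliberately terrible initial guess for sqrt(1)
steps = 0
while abs(g - 1.0) > 1e-9:
    g = (g + x / g) / 2  # first step overshoots to ~5e11, then halves each time
    steps += 1
print(steps)  # on the order of 40 iterations
```

Once $g$ is close to $1$, convergence becomes very fast (the error is roughly squared each step), which is why almost all of the iterations are spent just shrinking the overshoot back down.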