What is the computational complexity of the Newton-Raphson method for finding a square root?


I am not a math student, so I don't fully understand the complexity analysis given on the Wikipedia page for the Newton-Raphson method for finding square roots. But I wrote a computer program for the Newton-Raphson method and ran it on increasing values.

# Code snippet (Python 3; integer division, assuming n is a perfect square)
x = n = 100
y = 1
i = 0
while x > y:
    x = (x + y) // 2
    y = n // x
    i = i + 1
print("Iterations for convergence:", i)

Values (all log values are base 2):

N=10000 , Iterations : 9 , log(N) = 13.28

N=100000000 , Iterations : 16 , log(N) = 26.57

N=10000000000000000 , Iterations : 30 , log(N) = 53.15

…and so on. Assuming the two divisions take constant time, is the complexity of the method less than log(N)? At least that is what I see from my values. Can someone please explain in layman's terms whether the number of iterations of the NR method is less than log(N), or greater than or equal to log(N)? You can assume N is a perfect square for now.
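For reference, here is the loop above wrapped in a Python 3 function (a direct transcription, using integer division and assuming N is a perfect square) that reproduces the iteration counts quoted:

```python
import math

def heron_iterations(n):
    """Run the loop from the question; return (root found, iteration count)."""
    x, y, i = n, 1, 0
    while x > y:
        x = (x + y) // 2   # average of x and n // x
        y = n // x
        i += 1
    return x, i

for n in (10**4, 10**8, 10**16):
    root, iters = heron_iterations(n)
    print(f"N={n}: root={root}, iterations={iters}, log2(N)={math.log2(n):.2f}")
```

Running it gives 9 iterations for N = 10^4 and 16 for N = 10^8, matching the table.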


There are 2 best solutions below

BEST ANSWER

It depends on how good your initial guess is. In your program, you take an initial guess of $x=N$. Consequently there will be an initial period where $x/2$ is way bigger than $N/x$, so that the method is essentially just cutting the number in half over and over. So it will take roughly $\log_2(N/\sqrt{N})=1/2 \log_2(N)$ steps to get to within some fixed neighborhood of $\sqrt{N}$. Then there will be a "convergence" period, which takes a number of steps that really only depends on the tolerance, not on $N$. Adding these together will give a number of steps that scales like $\log_2(N)$ as $N \to \infty$ (for a fixed relative tolerance) but is not directly proportional to $\log_2(N)$.

That said, there are better initial guesses. For instance, on a normal computer, you are given $N$ in the form $N=a 2^k$ where $a$ is between $1$ and $2$ (including $1$, not including $2$) and $k$ is an integer. In other words this representation is given to you by the machine with no additional work.

In this case you can take your initial guess to be $2^{\lfloor k/2 \rfloor}$ where $\lfloor x \rfloor$ denotes the greatest integer less than or equal to $x$. Then your initial guess is within a factor of $4$ of the desired answer, so that initial "halving" period will only take $O(1)$ time. In fact by precomputing $\sqrt{2}$ you could do even better: $\sqrt{a2^k}=\sqrt{a}2^{k/2}=\begin{cases} \sqrt{2} \sqrt{a}2^{\lfloor k/2 \rfloor} & k \text{ is odd } \\ \sqrt{a} 2^{\lfloor k/2 \rfloor} & k \text{ is even } \end{cases}$, so now your routine can be reduced to just finding a square root of a number in $[1,2)$.
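As a sketch of this idea in Python: for integers, `int.bit_length` plays the role of the machine's exponent, giving the $k$ with $N = a \cdot 2^k$, $1 \le a < 2$. Since the loop's stopping condition `x > y` assumes the guess starts at or above the root, the sketch starts one power of two higher, at $2^{\lfloor k/2 \rfloor + 1}$, which is always $\ge \sqrt{N}$ and within a factor of $4$ of it:

```python
def isqrt_with_guess(n):
    """Heron iteration started from a power-of-two guess derived from the
    binary exponent of n, instead of from n itself."""
    k = n.bit_length() - 1       # n = a * 2**k with 1 <= a < 2
    x = 1 << (k // 2 + 1)        # 2**(floor(k/2) + 1) >= sqrt(n), within a factor of 4
    y = n // x
    i = 0
    while x > y:
        x = (x + y) // 2
        y = n // x
        i += 1
    return x, i

for n in (10**4, 10**8, 10**16):
    root, iters = isqrt_with_guess(n)
    print(f"N={n}: root={root}, iterations={iters}")
```

With this starting point the halving phase disappears: N = 10^4 takes 2 iterations instead of 9, and N = 10^8 takes 3 instead of 16.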


The average theoretical complexity is $\log_2(N)$. However, two notes:

First, $\log_2(N)$ is just the highest-order term. The actual number of steps to convergence is better described as

$\text{constant} + \text{coefficient} \cdot \log_2(N)$

Secondly, there is some nuance in the exact floating-point computations: convergence might occur a step sooner or later, depending on when $x$ and $y$ get within "machine epsilon" of each other (i.e., so close that they can't be distinguished).

Basically, don't worry: the number of steps to convergence roughly follows $\log_2(N)$. If you tried more values of $N$, the fit would be even closer.