Simple upper bound on number of iterations of Euclidean algorithm


This is from the book "An Introduction to the Theory of Numbers" by Niven, Zuckerman and Montgomery.

The exact number of iterations $j$ of the Euclidean algorithm required to calculate $(b,c)$ depends in an intricate manner on $b$ and $c$, but it is easy to establish a rough bound for $j$ as follows: If $r_i$ is small compared to $r_{i-1}$, say $r_i\leq r_{i-1}/2$, then substantial progress has been made at this step. Otherwise $r_{i-1}/2<r_{i}<r_{i-1}$, in which case $q_{i+1}=1$ and $r_{i+1}=r_{i-1}-r_i<r_{i-1}/2$. Thus we see that $r_{i+1}<r_{i-1}/2$ in either case. From this it can be deduced that $j<3\log c$ ($\log$ is base $e$). With more care we could improve on the constant $3$, but in any case $j$ is comparable to $\log c$, so the $\gcd$ is easy to compute, as $\log$ grows slowly.

What I don't understand is - how is it deduced that $j<3\log c$?


The remainder at least halves every two steps, since $r_{i+1}<r_{i-1}/2$, so after $j$ steps the remainder is less than $c/2^{\lfloor j/2\rfloor}$. The remainder must still be at least $1$ while the algorithm runs, so $2^{\lfloor j/2\rfloor}<c$, giving $j< 2+2\log_2 c$; for the purposes of a rough bound, $j < 2\log_2 c = (2\log_2 e)\log_e c < 3\log_e c$, since $2\log_2 e \approx 2.885$.
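The bound can be checked empirically. Below is a small sketch (the function name `euclid_steps` is my own) that counts the division steps of the Euclidean algorithm and verifies $j < 3\log_e c$ on random inputs with $c \ge 2$ (the bound is vacuous at $c = 1$, where $3\log 1 = 0$):

```python
import math
from random import randint

def euclid_steps(b, c):
    """Count division steps j used to compute gcd(b, c), for b >= c >= 1."""
    j = 0
    while c != 0:
        b, c = c, b % c  # one division step: r_{i+1} = r_{i-1} mod r_i
        j += 1
    return j

# Check j < 3*log(c) on random pairs; worst cases are consecutive
# Fibonacci numbers, where every quotient q_i equals 1.
for _ in range(1000):
    c = randint(2, 10**6)
    b = randint(c, 10**9)
    assert euclid_steps(b, c) < 3 * math.log(c)
```

For instance, the consecutive Fibonacci pair $(13, 8)$ takes $5$ steps, comfortably below $3\log 8 \approx 6.24$.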