Running time of the number of steps to find GCD(a,b)


Let $t_n$ be the largest (worst case) number of steps being used to find the greatest common divisor $GCD(a,b)$ when $n\geq a\geq b >0$.

Is it true that $t_n$ is $\mathcal{O}(n)$ rather than $\mathcal{O}(1)$ because the number of steps can never exceed the smaller of the two arguments of $GCD(a,b)$?

And is the reason the worst-case running time isn't $\mathcal{O}(1)$ that the number of steps grows with $n$, so no constant bound on it can hold for all inputs?

Note: Say I want to find GCD(13,8)

\begin{align} 13 & = 1 \cdot 8 + 5 \tag{step 1} \\[0.5em] 8 & = 1 \cdot 5 + 3 \tag{step 2} \\[0.5em] 5 & = 1 \cdot 3 + 2 \tag{step 3} \\[0.5em] 3 & = 1 \cdot 2 + 1 \tag{step 4} \\[0.5em] 2 & = 2 \cdot 1 + 0 \tag{step 5} \end{align}

In this case $GCD(13,8) = 1$, but the number of steps is 5, i.e. for $n = 13$ we have $t_n = 5$.
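The trace above can be checked with a short step-counting helper. This is a sketch (the function name `gcd_steps` is my own, not from the question); each loop iteration corresponds to one division step in the worked example:

```python
def gcd_steps(a, b):
    """Euclidean algorithm: returns (gcd, number of division steps)."""
    steps = 0
    while b != 0:
        # One division step: a = q*b + r, then continue with (b, r).
        a, b = b, a % b
        steps += 1
    return a, steps

print(gcd_steps(13, 8))  # -> (1, 5), matching the 5 steps above
```

Note that 8 and 13 are consecutive Fibonacci numbers, which is why every quotient in the trace (except the last) is 1; such pairs are the slowest inputs for the Euclidean algorithm.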