I'm new to number theory. I just understood the proof of the Euclidean Algorithm and how it cleverly uses the fact that $\mathrm{gcd}(a,b) = \mathrm{gcd}(b,r)$ repeatedly, where $a$ is the dividend, $b$ is the divisor, and $r$ is the remainder.
One thing I still don't understand, though: why do we always eventually end up with a remainder of zero? What's the logic behind this?
There are a lot of fantastic answers to this question already, but here is another perspective on the problem.
We seek to show that for all $a\geq b>0$, the remainder $r$ of dividing $a$ by $b$ satisfies $r<\frac12a$.
Case 1: $b> \frac12a\;\to$ Since $\frac12a<b\leq a$, the quotient in the expression $a=q\cdot b+r$ is $q=1$. So, $$r=a-b<a-\frac12a=\frac12a.$$
Case 2: $b\leq\frac12a\;\to$ Since $0\leq r<b\leq\frac12a$, $r<\frac12a$.
Either way, $r<\frac12a$.
This is very useful: it shows that the Euclidean Algorithm shrinks its arguments rather quickly. Since each remainder is less than half of the current $a$, and each step's $a$ becomes a remainder two steps later, the first argument at least halves every two divisions. The remainders therefore form a strictly decreasing sequence of nonnegative integers bounded in this way, so the algorithm must reach $0$ after at most about $2\log_2 a$ steps.
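If it helps to see this concretely, here is a minimal sketch in Python (my own illustration, not part of the proof): it runs the Euclidean Algorithm, asserts the bound $r<\frac12a$ at every division step, and counts the steps so you can compare against the $\approx 2\log_2 a$ estimate.

```python
def gcd_steps(a, b):
    """Euclidean Algorithm for a >= b > 0.

    Returns (gcd, number of division steps), checking along the way
    that each remainder is less than half the current dividend --
    the bound proved in the case analysis above.
    """
    steps = 0
    while b != 0:
        r = a % b
        assert r < a / 2  # r < a/2 holds whenever a >= b
        a, b = b, r       # gcd(a, b) = gcd(b, r)
        steps += 1
    return a, steps
```

For example, `gcd_steps(48, 18)` performs the divisions $48 = 2\cdot 18 + 12$, $18 = 1\cdot 12 + 6$, $12 = 2\cdot 6 + 0$ and returns the gcd $6$ after $3$ steps, well under the $2\log_2 48 \approx 11$ bound.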