This is the proof of the Euclidean algorithm that my CS professor taught us in the last lecture:
For $a\gt b: \gcd(a,b) = \gcd(a-b,b)$
Let $g=\gcd(a,b), g'=\gcd(a-b,b)$
Then:
$a=q_a \cdot g$ and $a-b = q'_{a-b} \cdot g'$
$b=q_b \cdot g $ and $b = q'_b \cdot g'$
$a-b=\left(q_a-q_b\right) \cdot g$ and $a =\left(q'_{a-b}+q'_b\right)\cdot g'$
That means: $g$ divides $a-b,b$ and $g'$ divides $a,b$
Thus: $g \le g'$ and $g' \le g$ $\Rightarrow g = g' $
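(As a quick sanity check of the identity the proof establishes, here is a short Python snippet that verifies $\gcd(a,b) = \gcd(a-b,b)$ for small pairs, using the standard library's `math.gcd` just for verification:)

```python
from math import gcd

# Verify gcd(a, b) == gcd(a - b, b) for all pairs with a > b in a small range.
for a in range(2, 50):
    for b in range(1, a):
        assert gcd(a, b) == gcd(a - b, b)
print("identity holds for all checked pairs")
```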
I don't understand how to arrive from "$g$ divides $a-b,b$ and $g'$ divides $a,b$" to "$g \le g'$ and $g' \le g$". Can you please explain that to me?
This goes back to the formal definition of gcd. In words, the formal definition says that if $d=\gcd(a,b)$, then $d$ is a divisor of both $a$ and $b$, and any other number that divides both $a$ and $b$ is less than or equal to $d$. (That is, to be pedantic, $d$ is the greatest of all the common divisors of $a$ and $b$.)
What your professor showed is that $g$ is a common divisor of $a-b$ and $b$, but $g'$ is by definition the greatest common divisor of those two numbers, so $g\leq g'$. At the same time, $g'$ is a common divisor of $a$ and $b$, but $g=\gcd(a,b)$, so $g'\leq g$.
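Incidentally, the identity your professor proved is exactly what makes the subtraction-based form of the Euclidean algorithm work: each step replaces $(a,b)$ with $(a-b,b)$, which the argument above shows preserves the gcd. A minimal sketch (the function name `gcd_by_subtraction` is my own, not from your lecture):

```python
def gcd_by_subtraction(a: int, b: int) -> int:
    """Compute gcd of two positive integers by repeated subtraction."""
    while a != b:
        if a > b:
            a = a - b   # gcd(a, b) == gcd(a - b, b) when a > b
        else:
            b = b - a   # symmetric case when b > a
    return a            # gcd(a, a) == a

print(gcd_by_subtraction(48, 18))  # 6
```

The modern version replaces repeated subtraction with the remainder operation, since $a \bmod b$ is what remains after subtracting $b$ from $a$ as many times as possible.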