As I see it, one can find the GCD of two integers (say $a$ and $b$) by finding all of their common factors and then taking the maximum of these. This also justifies the terminology "greatest common divisor."
However, the general definition used is that $d$ is said to be a GCD of $a$ and $b$ if
- $d$ divides both $a$ and $b$; and,
- If $d'$ also divides both $a$ and $b$, then $d'$ divides $d$.
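For small numbers the two definitions can be compared directly by brute force. Here is a minimal sketch (the function names are my own, chosen for illustration) that checks that the "largest common factor" from the first definition also satisfies the divisibility property of the second:

```python
# Sketch: check that the "maximum common factor" definition agrees with the
# divisibility-based definition for small positive integers.
# Names (max_common_factor, is_gcd_by_divisibility) are illustrative, not standard.

def common_factors(a, b):
    """All positive integers dividing both a and b."""
    return [k for k in range(1, min(a, b) + 1) if a % k == 0 and b % k == 0]

def max_common_factor(a, b):
    """First definition: the largest of the common factors."""
    return max(common_factors(a, b))

def is_gcd_by_divisibility(d, a, b):
    """Second definition: d divides both a and b, and every
    common divisor of a and b divides d."""
    return (a % d == 0 and b % d == 0
            and all(d % c == 0 for c in common_factors(a, b)))

# The number picked out by the first definition also satisfies the second:
for a in range(1, 30):
    for b in range(1, 30):
        assert is_gcd_by_divisibility(max_common_factor(a, b), a, b)
```

For example, the common factors of $12$ and $18$ are $1, 2, 3, 6$; the maximum is $6$, and indeed every one of $1, 2, 3, 6$ divides $6$.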
My question is: why do we usually prefer the second definition over the first? To me the first seems very intuitive and simple, and does justice to the terminology. The same question applies to the LCM as well.
Looking forward to your response. Thank you!
For two reasons:
1) You can calculate the GCD and LCM without knowing anything about prime numbers and prime factorization.
2) What is the GCD of 13121341 and 234132431?
Prime factorization cannot help you here: factoring is believed to be hard, even for computers. In fact, much of internet security rests on the fact that no efficient algorithm is known for factoring large numbers.
You can calculate this GCD using the Euclidean algorithm, which is easily proven correct using the second definition and has nothing to do with prime factorization.
I think the second reason is usually the main one...