How do you prove that any two integers will always have a greatest common divisor?


I think my book proves it by showing that the set of all linear combinations of two integers is a principal ideal of the integers. But I can't see how this proves that any two integers always have a gcd. Is there a different way of proving this?


BEST ANSWER

Well, let's see... say the original numbers are $a$ and $b$, and the ideal of all linear combinations is the principal ideal generated by $d$. That is, every linear combination of $a$ and $b$ is a multiple of $d$. This already shows $d \mid a$ and $d \mid b$, because both $a$ and $b$ are (trivially) linear combinations of $a$ and $b$. Conversely, $d$ itself is a linear combination of $a$ and $b$ (because it is an element of the ideal it generates), so any common divisor of $a$ and $b$ divides $d$ as well. So $\gcd(a,b) = d$ by definition of $\gcd$.
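The argument above is constructive: the extended Euclidean algorithm actually produces the generator $d$ together with coefficients $x, y$ such that $d = ax + by$. A small Python sketch (function name mine, assuming nonnegative inputs, not both zero) illustrating this:

```python
def extended_gcd(a, b):
    """Return (d, x, y) with d = gcd(a, b) and d == a*x + b*y.

    Assumes a, b are nonnegative integers, not both zero.
    """
    if b == 0:
        # gcd(a, 0) = a, witnessed by a = a*1 + 0*0
        return (a, 1, 0)
    # Recurse on (b, a mod b); gcd is unchanged by this step.
    d, x, y = extended_gcd(b, a % b)
    # Back-substitute: a mod b = a - (a // b) * b.
    return (d, y, x - (a // b) * y)
```

Running it on, say, $a = 12$, $b = 18$ returns $d = 6$ with coefficients expressing $6$ as a linear combination of $12$ and $18$; the identity $d = ax + by$ is exactly why every common divisor of $a$ and $b$ divides $d$.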


The Fundamental Theorem of Arithmetic states that any integer $n \geq 2$ can be written as a product of primes, uniquely up to the order of the factors.

Case 1A: $p, q \neq 1, 0$, and $p \mid q$ or $q \mid p$. Then the smaller of the two is itself the gcd, so GCF$(p,q) > 1$.

Case 1B: $p, q \neq 1, 0$, and neither divides the other. Then GCF$(p,q)$ is the product of the prime powers common to both factorizations; it equals 1 exactly when the factorizations share no prime (note it can still exceed 1, e.g. GCF$(4,6) = 2$).

Case 2: $p$ or $q$ equals 1 or 0. If either is 1, GCF$(p,q) = 1$; if one of them is 0, the gcd is the other number, since every integer divides 0.
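The factorization route in Case 1B can be made concrete: the gcd is the product, over the shared primes, of each prime raised to the minimum of its two exponents. A hedged Python sketch (function names mine, trial-division factoring, assuming inputs $\geq 2$):

```python
from collections import Counter

def prime_factors(n):
    """Return a Counter mapping each prime factor of n (n >= 2) to its exponent."""
    factors = Counter()
    p = 2
    while p * p <= n:
        while n % p == 0:
            factors[p] += 1
            n //= p
        p += 1
    if n > 1:  # leftover factor is prime
        factors[n] += 1
    return factors

def gcd_by_factoring(a, b):
    """gcd(a, b) for a, b >= 2, via min of prime exponents."""
    # Counter intersection (&) keeps shared keys with the minimum count.
    common = prime_factors(a) & prime_factors(b)
    g = 1
    for p, e in common.items():
        g *= p ** e
    return g
```

For example, $84 = 2^2 \cdot 3 \cdot 7$ and $30 = 2 \cdot 3 \cdot 5$ share $2^1 \cdot 3^1$, giving gcd $6$. This also shows why the ideal-based proof in the accepted answer is preferred in general: it avoids factoring, which is expensive, and generalizes to rings without unique factorization.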