Prove that if $\gcd(a,b)=1$, then $\gcd(a+b,ab)=1$


I need to prove that:

If $\gcd(a,b)=1$, then $\gcd(a+b,ab)=1.$

So far I used what's given so I have:

$ax+by=1$ (I wrote the gcd of a and b as a linear combination)

and

$(a+b)u+abv=1$ (I also wrote this as a linear combination)

Where do I go from here?


There are 5 best solutions below

---

Assume this is not true.

Let $\gcd(a+b,ab)=m>1$. Then there exists a prime number $p$ which divides $m$.

If $p\mid a+b$ and $p\mid ab$, then $p$ divides $a$ or $b$.

Assume that $p\mid a$ (the case $p\mid b$ is symmetric). But then $p\mid a+b$ implies that $p\mid b$, and hence $p\mid\gcd(a,b)=1$, which is a contradiction.

Note. We have used the fact that: If $p$ is a prime and $p$ divides $ab$ then $p$ divides $a$ or $b$.
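As a numerical sanity check of the claim (not a substitute for the proof above), one can brute-force small coprime pairs; a minimal Python sketch:

```python
from math import gcd

# Check gcd(a+b, ab) == 1 for every coprime pair (a, b) in a small range.
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(a + b, a * b) == 1
print("verified for all coprime pairs with 1 <= a, b < 200")
```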

---

Proof broken into lemmas:

Lemma 1: If $n$ is rel. prime to $a$ and $n$ is rel. prime to $b$, then $n$ is rel. prime to $ab$.

Proof: multiply together the Bézout identities for the two hypotheses, say $nx_1 + ay_1 = 1$ and $nx_2 + by_2 = 1$; every term in the product except $ab\,y_1y_2$ contains a factor of $n$, so collecting gives a Bézout identity for $n$ and $ab$.

Lemma 2: If $a$ and $b$ are rel. prime, then both are rel. prime to $a+b$.

Proof: It's enough to prove that if $a$ and $b$ are rel. prime then $b$ is rel. prime to $a+b$. We have the Bézout identity $ax + by = 1$. Now add and subtract $bx$ on the left to get $(a+b)x + b(y-x) = 1$.

Conclusion: We are given $a$ and $b$ rel. prime. By the 2nd lemma, $a+b$ is rel. prime to both $a$ and $b$. By the first lemma, $a+b$ is rel. prime to $ab$.
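Both lemmas are easy to spot-check numerically; a small Python sketch (an illustration, not part of the proof):

```python
from math import gcd

# Lemma 1: if gcd(n, a) == gcd(n, b) == 1, then gcd(n, a*b) == 1.
# Lemma 2: if gcd(a, b) == 1, then gcd(a+b, a) == gcd(a+b, b) == 1.
for a in range(1, 60):
    for b in range(1, 60):
        for n in range(1, 60):
            if gcd(n, a) == 1 and gcd(n, b) == 1:
                assert gcd(n, a * b) == 1               # Lemma 1
        if gcd(a, b) == 1:
            assert gcd(a + b, a) == 1 == gcd(a + b, b)  # Lemma 2
print("both lemmas hold on the sampled range")
```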

---

Below is a proof that is messier but more primitive than Yiorgos' proof. By Bézout's identity, there exist $x,y \in \mathbb{Z}$ such that

$$ax + by = 1.$$

Squaring both sides, we see that

$$a^2 x ^ 2 + 2abxy + b^2 y^2 = 1.$$

But notice that

$$a^2 x ^ 2 + 2abxy + b^2 y^2 = ab(2xy-x^2-y^2) + (a+b)(ax^2+by^2).$$

And hence, those same $x,y$ as above give

$$ ab(2xy-x^2-y^2) + (a+b)(ax^2+by^2) = 1.$$

So it must be that $\gcd{(a+b,ab)} = 1$.
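The rearrangement step can be confirmed by expanding both sides, or numerically on random integers; a quick Python check (a sketch, not part of the proof):

```python
from random import randint

# Check that a^2 x^2 + 2abxy + b^2 y^2
#   == ab(2xy - x^2 - y^2) + (a+b)(a x^2 + b y^2)
# holds identically, by testing random integer values.
for _ in range(1000):
    a, b, x, y = (randint(-50, 50) for _ in range(4))
    lhs = a*a*x*x + 2*a*b*x*y + b*b*y*y
    rhs = a*b*(2*x*y - x*x - y*y) + (a + b)*(a*x*x + b*y*y)
    assert lhs == rhs
print("identity holds on 1000 random samples")
```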

---

I like pushing the limits of the Euclidean algorithm. Let's do our best to eliminate the $b$'s:

$$\begin{align} \gcd(a+b, ab) &= \gcd(a+b, ab - a (a+b)) \\&= \gcd(a+b, -aa) \\&= \gcd(a+b, a^2) \end{align}$$

Since we also have $\gcd(a+b, a) = \gcd(b, a) = 1$ (one more Euclidean step), and a number coprime to $a$ is coprime to $a^2$, we can infer that $\gcd(a+b, a^2) = 1$.
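Each Euclidean step above is easy to verify in code; a minimal Python sketch checking the chain:

```python
from math import gcd

# Euclidean step: gcd(m, n) == gcd(m, n - k*m) for any integer k,
# and gcd ignores sign, so gcd(a+b, -a^2) == gcd(a+b, a^2).
for a in range(1, 200):
    for b in range(1, 200):
        lhs = gcd(a + b, a * b)
        assert lhs == gcd(a + b, a * b - a * (a + b)) == gcd(a + b, a * a)
        if gcd(a, b) == 1:
            assert lhs == 1
print("gcd chain verified for 1 <= a, b < 200")
```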

---

We're given $\gcd(a,b) = 1$, and from Bézout's identity we have $\gcd(a,b) = 1 \iff ax + by = 1$ for some integers $x$ and $y$.

$1 = ax + by = ax + bx - bx + by = (a + b)x + b(y - x)$, so $\gcd(a+b,b) = 1$. Likewise, $\gcd(a+b,a)=1$ because $(a + b)y + a(x - y) = 1$.

$$ \begin{align} 1 &= ((a + b)x + b(y - x))((a + b)y + a(x -y)) \\&= (a+b)(a+b)xy + (a+b)ax(x-y) + (a+b)by(y-x) + ab(y-x)(x-y) \\&= (a+b)[(a+b)xy + ax(x-y) + by(y-x)] + ab[(y-x)(x-y)] \end{align}$$

So $\gcd(a+b,ab) = 1$.
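This final combination can be checked computationally: compute actual Bézout coefficients $x, y$ with the extended Euclidean algorithm and plug them into the bracketed expressions. A Python sketch (the `ext_gcd` helper is a standard extended-Euclid routine written here just for the check):

```python
from math import gcd

def ext_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

# For coprime a, b, take Bezout coefficients x, y and check that
# (a+b)*U + ab*V == 1 with U, V read off from the expansion above.
for a in range(1, 100):
    for b in range(1, 100):
        if gcd(a, b) != 1:
            continue
        g, x, y = ext_gcd(a, b)
        U = (a + b)*x*y + a*x*(x - y) + b*y*(y - x)
        V = (y - x)*(x - y)
        assert (a + b)*U + a*b*V == 1
print("Bezout combination verified for all coprime pairs tested")
```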