If $(a,b)=1$ then prove $(a+b, ab)=1$.


Let $a$ and $b$ be two integers such that $\left(a,b\right) = 1$. Prove that $\left(a+b, ab\right) = 1$.

$(a,b)=1$ means that $a$ and $b$ have no prime factors in common.

$ab$ is simply the product of $a$ and $b$, so every prime factor of $ab$ is a factor of $a$ or of $b$.

Let's say $k\mid a+b$ where $k$ is some factor of $a$.

Then $ka=a+b$, so $ka-a=b$, i.e. $a(k-1)=b$.

So $a(k-1)=b$, and $a\mid a(k-1)$ [$a$ divides the left-hand side], therefore $a\mid b$ [the right-hand side].

But $(a,b)=1$ so $a$ cannot divide $b$.

We have a similar argument for $b$.

So $a+b$ is not divisible by any factors of $ab$.

Therefore, $(a+b, ab)=1$.

Would this be correct? Am I missing anything?


There are 4 answers below.

Answer 1

Your jump from "$k\mid a+b$ with $k\mid a$" to $ka=a+b$ is wrong. Just because $k$ is a factor of $a$ doesn't mean that the quotient $(a+b)/k$ is exactly $a$. That kills the rest of your argument.


Instead, here is an approach that doesn't mention prime factors at all. It starts from the well-known property that $(a,b)=1$ if and only if there are $p,q\in\mathbb Z$ such that $pa+qb=1$.

Square $pa+qb=1$ to get $$ p^2a^2 + q^2b^2 + 2pqab = 1. $$ If we can show that each term on the left-hand side is an integer combination of $a+b$ and $ab$, then the right-hand side is too.

But $2pqab$ is clearly an integer combination of $a+b$ and $ab$, namely $0(a+b)+2pq\cdot ab$.

And $a^2 = a(a+b)-ab$, so $p^2a^2 = p^2a(a+b)-p^2\cdot ab$.

Similarly $q^2b^2 = q^2b(a+b)-q^2\cdot ab$.

Collecting everything, $(p^2a+q^2b)(a+b)+(2pq-p^2-q^2)ab=1$, so $(a+b,ab)=1$.
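This identity can be sanity-checked numerically; here is a minimal sketch (the `ext_gcd` helper and the sample pairs are my own illustrative choices, not part of the answer):

```python
from math import gcd

def ext_gcd(a, b):
    """Extended Euclid: returns (g, p, q) with p*a + q*b == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, p, q = ext_gcd(b, a % b)
    return g, q, p - (a // b) * q

for a, b in [(4, 9), (15, 28), (7, 10)]:
    g, p, q = ext_gcd(a, b)
    assert g == 1 and p * a + q * b == 1
    # The identity obtained by squaring p*a + q*b == 1:
    assert (p**2 * a + q**2 * b) * (a + b) + (2*p*q - p**2 - q**2) * (a * b) == 1
    # Hence a + b and a*b are coprime:
    assert gcd(a + b, a * b) == 1
```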

Answer 2

You say:

Let’s say $k\mid a+b$ where $k$ is some factor of $a$.

Then $ka=a+b$ ...

This step really doesn’t make any sense as written. If $k\mid a+b$, there is some integer $m$ such that $km=a+b$, but there’s certainly no reason to think that $m=a$. What you want to do at this point is use the hypothesis that $k\mid a$: there is some integer $n$ such that $a=kn$, and therefore $km=kn+b$. Now you can solve for $b$ and observe that $k\mid b$. But then $k$ is a common divisor of $a$ and $b$, so $k=\pm1$, and you’ve shown that $\gcd(a,a+b)=1$.

A similar argument shows that $\gcd(b,a+b)=1$, and then your last step is fine.
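The corrected argument can be spot-checked by brute force; a minimal sketch (the ranges are arbitrary, chosen only for illustration):

```python
from math import gcd

# For every coprime pair (a, b): any common divisor k of a and a + b
# also divides b = (a + b) - a, so gcd(a, a + b) = 1; likewise for b.
for a in range(1, 50):
    for b in range(1, 50):
        if gcd(a, b) == 1:
            assert gcd(a, a + b) == 1
            assert gcd(b, a + b) == 1
```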

Answer 3

Suppose that the gcd is not $1$. Then there is a prime $p$ that divides $ab$ and $a+b$.

But then $p$ divides one of $a$ or $b$, say $a$. Since $p$ divides $a+b$, it follows that $p$ divides $b$. This contradicts the fact that $a$ and $b$ are relatively prime.
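This prime-factor argument can be checked by brute force; a sketch (the `prime_factors` helper and the test ranges are illustrative assumptions):

```python
from math import gcd

def prime_factors(n):
    """Trial-division prime factorization; adequate for small n."""
    p, out = 2, set()
    while p * p <= n:
        while n % p == 0:
            out.add(p)
            n //= p
        p += 1
    if n > 1:
        out.add(n)
    return out

# If gcd(a, b) == 1, no prime factor of a*b divides a + b.
for a in range(2, 40):
    for b in range(2, 40):
        if gcd(a, b) == 1:
            assert all((a + b) % p != 0 for p in prime_factors(a * b))
            assert gcd(a + b, a * b) == 1
```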

Answer 4

Suppose that $\gcd(a,b)=1$.

Let $d=\gcd(a+b,ab)$.

Now $d\mid(a+b)$ and $d\mid ab$.

Suppose that $d\mid a$.

Then $a=dq_a$ for some integer $q_a$.

Now examining $a+b=dq_1$ we have $dq_a+b=dq_1$, so $b=d(q_1-q_a)$.

So then $d\mid b$, and since $\gcd(a,b)=1$, $d=1$.

A similar argument shows $d\mid b$ implies $d\mid a$.

So we know that $d\mid a$ or $d\mid b$ implies $d=1$.

Suppose that $d\nmid b$ and $d\nmid a$.

We have $a+b=dq_1$, so $a=dq_1-b$.

Here we have that $\gcd(a,-b)=1=\gcd(a,d)$ (this is from a lemma used in the proof of the Euclidean algorithm).

We can also derive that $\gcd(b,-a)=1=\gcd(b,d)$.

So we know that $a$ and $d$, and $b$ and $d$, are both relatively prime.

Considering $d\mid ab$, we know by Euclid's lemma that since $a$ and $d$ are relatively prime, $d\mid b$.

This contradicts our assumption, so this case is not possible.

Therefore, we may conclude that $d=1$. $\square$
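Euclid's lemma, as used in the final step, can likewise be spot-checked by brute force (the ranges are arbitrary illustrative choices):

```python
from math import gcd

# Euclid's lemma, checked exhaustively over small values:
# if gcd(d, a) == 1 and d divides a*b, then d divides b.
for d in range(2, 30):
    for a in range(1, 30):
        for b in range(1, 30):
            if gcd(d, a) == 1 and (a * b) % d == 0:
                assert b % d == 0
```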