What's special about the greatest common divisor of a + b and a - b?


Problem:

Prove that if gcd( a, b ) = 1, then gcd( a - b, a + b ) is either 1 or 2.

From Bézout's identity, I know there are integers m, n with am + bn = 1, since a and b are relatively prime. However, I could not find a way to link this to a - b and a + b. I did realize that for gcd( a, b ) = 1 to hold, a and b cannot both be even. I played around with some examples, e.g. (13, 17), and the claim does seem to be true. Any idea?
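A quick empirical check (my own, not part of the question) that tries all coprime pairs in a small range and records which values gcd(a - b, a + b) actually takes:

```python
from math import gcd

# For every coprime pair (a, b) with b < a < 50, collect gcd(a - b, a + b).
# If the claim holds, the only values that can appear are 1 and 2.
results = set()
for a in range(1, 50):
    for b in range(1, a):
        if gcd(a, b) == 1:
            results.add(gcd(a - b, a + b))

print(results)  # both 1 and 2 occur, and nothing else
```

For instance (2, 1) gives gcd(1, 3) = 1, while (3, 1) gives gcd(2, 4) = 2, so both possibilities are realized.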

There are 3 answers below.

BEST ANSWER

The gcd of $x$ and $y$ divides any linear combination of $x$ and $y$. And any number that divides $r$ and $s$ divides the gcd of $r$ and $s$.

If you add $a+b$ and $a-b$, you get <blank>, so $\gcd(a+b,a-b)$ divides <blank>.

If you subtract $a-b$ from $a+b$, you get <blankity>, so $\gcd(a+b,a-b)$ divides <blankity>.

So $\gcd(a+b,a-b)$ divides $\gcd($<blank>$,$<blankity>$) = $ <blankety-blank>.

(For good measure, once you believe the result, you'll want to come up with examples where you get $1$ and examples where you get $2$, just to convince yourself that the statement you are trying to prove is the best you can do.)
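Two concrete instances in the spirit of that suggestion (my own examples, not part of the answer):

```latex
\[
  \gcd(3,2) = 1, \qquad \gcd(3-2,\ 3+2) = \gcd(1,5) = 1;
\]
\[
  \gcd(3,1) = 1, \qquad \gcd(3-1,\ 3+1) = \gcd(2,4) = 2.
\]
```

In fact the value $2$ occurs exactly when $a$ and $b$ are both odd (then $a-b$ and $a+b$ are both even), while if $a$ and $b$ have opposite parity both $a-b$ and $a+b$ are odd, so the gcd is $1$.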

ANSWER

Note that $d|(a-b)$ and $d|(a+b)$ where $d = \gcd(a-b, a+b)$. So $d$ divides the sum and difference (i.e. $2a$ and $2b$).
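Spelling out the final step of this hint (my own addition, using the identity $\gcd(2a, 2b) = 2\gcd(a,b)$):

```latex
Let $d = \gcd(a-b,\ a+b)$. Then
\[
  d \mid (a-b) + (a+b) = 2a
  \qquad\text{and}\qquad
  d \mid (a+b) - (a-b) = 2b,
\]
so $d \mid \gcd(2a, 2b) = 2\gcd(a,b) = 2$, and therefore $d \in \{1, 2\}$.
```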

ANSWER

Hint $\ a-b + (a+b)\,i \,=\, (1+i)\,(a+b\,i)\ $ yields a slick proof using Gaussian integers. This reveals the arithmetical essence of the matter and, hence, suggests obvious generalizations.
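One way to unpack this hint (my own reading of it, offered as a sketch rather than the answerer's argument): a common divisor of $a-b$ and $a+b$ in $\mathbb{Z}$ divides the Gaussian integer on the left, hence the product on the right.

```latex
Suppose $d \mid a-b$ and $d \mid a+b$ in $\mathbb{Z}$. Then in $\mathbb{Z}[i]$,
\[
  d \mid (a-b) + (a+b)\,i = (1+i)(a+b\,i).
\]
For an odd prime $p \mid d$: since $N(1+i) = 2$ is coprime to $N(p) = p^2$,
$p$ is coprime to $1+i$ in $\mathbb{Z}[i]$, so $p \mid a+b\,i$, forcing
$p \mid a$ and $p \mid b$, contradicting $\gcd(a,b) = 1$.
If $4 \mid d$: since $4 = -(1+i)^4$, we get $(1+i)^3 \mid a+b\,i$ up to units,
hence $2 \mid a+b\,i$ (as $(1+i)^2 = 2i$), so $2 \mid a$ and $2 \mid b$,
again a contradiction. Therefore $d \mid 2$.
```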