Problem:
Prove that if $\gcd(a, b) = 1$, then $\gcd(a - b, a + b)$ is either $1$ or $2$.
From Bézout's identity, I know that $am + bn = 1$ for some integers $m, n$ when $a$ and $b$ are relatively prime. However, I could not find a way to link this to $a - b$ and $a + b$. I did realize that for $\gcd(a, b) = 1$ to hold, $a$ and $b$ cannot both be even. I played around with some examples, like $(13, 17)$, and saw that the claim does seem to hold. Any ideas?
The gcd of $x$ and $y$ divides any linear combination of $x$ and $y$. And any number that divides $r$ and $s$ divides the gcd of $r$ and $s$.
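These two facts are easy to check numerically. Here is a small sketch (the specific numbers $12, 18, 24, 36$ are just illustrative choices of mine, not from the problem) that verifies both on concrete values:

```python
from math import gcd

x, y = 12, 18
g = gcd(x, y)  # 6

# Fact 1: gcd(x, y) divides every linear combination m*x + n*y.
for m in range(-3, 4):
    for n in range(-3, 4):
        assert (m * x + n * y) % g == 0

# Fact 2: any common divisor of r and s divides gcd(r, s).
r, s = 24, 36
for d in range(1, min(r, s) + 1):
    if r % d == 0 and s % d == 0:
        assert gcd(r, s) % d == 0
```

Both loops pass silently, which is exactly what the two divisibility facts predict.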
If you add $a+b$ and $a-b$, you get <blank>, so $\gcd(a+b, a-b)$ divides <blank>.

If you subtract $a-b$ from $a+b$, you get <blankity>, so $\gcd(a+b, a-b)$ divides <blankity>.

So $\gcd(a+b, a-b)$ divides $\gcd($<blank>$,$ <blankity>$) = $ <blankety-blank>.

(For good measure, assuming the result is true, you'll want to come up with examples where you get $1$ and examples where you get $2$, just to convince yourself that the statement you are trying to prove is the best you can do.)
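To collect such examples, a brute-force search over coprime pairs is enough. This is an empirical sanity check, not a proof; the helper `check` and the search limit are my own choices:

```python
from math import gcd

def check(limit=50):
    """Collect the values gcd(a - b, a + b) takes over coprime pairs a > b >= 1."""
    results = set()
    for a in range(1, limit):
        for b in range(1, a):
            if gcd(a, b) == 1:
                results.add(gcd(a - b, a + b))
    return results

print(check())  # -> {1, 2}
```

Pairs like $(3, 2)$ give $\gcd(1, 5) = 1$, while pairs like $(3, 1)$ give $\gcd(2, 4) = 2$, so both values really occur and the statement is sharp.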