If $\gcd(a,b)=1$ then $\gcd(a^2-b^2,a^2+b^2)=1$ or $2$


If $\gcd(a,b)=1$, show that $\gcd(a^2-b^2,a^2+b^2) = 1 \text{ or } 2$.

I tried this: $\gcd(a,b) = 1$.

Then, $\gcd(a^2, b^2)=1$.

Now, if $a$ and $b$ are both odd, then $a^2+b^2$ and $a^2-b^2$ are both even, so $2$ is a common divisor of $a^2-b^2$ and $a^2+b^2$.

But how do I show the gcd can only be $1$ or $2$?
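As a quick sanity check (a brute-force sketch, not part of the proof), one can verify the claim over small coprime pairs; it also shows exactly when each value occurs: the gcd is $2$ precisely when $a$ and $b$ are both odd.

```python
from math import gcd

# Check gcd(a^2 - b^2, a^2 + b^2) for all coprime pairs in a small range.
for a in range(1, 50):
    for b in range(1, a):
        if gcd(a, b) == 1:
            g = gcd(a * a - b * b, a * a + b * b)
            assert g in (1, 2)
            # The gcd is 2 exactly when a and b are both odd,
            # since only then are a^2 - b^2 and a^2 + b^2 both even.
            assert (g == 2) == (a % 2 == 1 and b % 2 == 1)
```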

3 Answers


Using the fact that $\gcd(p,q) = \gcd(p, q \pm p)$,

$$\gcd(a^2-b^2, a^2+b^2) = \gcd(a^2-b^2, 2a^2)$$

If $a^2-b^2$ is odd, then

$$\gcd(2a^2, a^2-b^2) = \gcd(a^2, a^2-b^2) = \gcd(a^2, b^2) = \gcd(a, b)^2 = 1.$$

Otherwise, if $a^2-b^2$ is even (so $a$ and $b$ are both odd, and in particular $a^2$ is odd), then

$$\gcd(2a^2, a^2-b^2) = 2\gcd(a^2, a^2-b^2) = 2\gcd(a^2, b^2) = 2\gcd(a, b)^2 = 2.$$
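A concrete instance of each case (the numbers are illustrative, not taken from the answer): $a=3,\,b=2$ makes $a^2-b^2=5$ odd, while $a=3,\,b=1$ makes $a^2-b^2=8$ even.

```python
from math import gcd

# Odd case: a = 3, b = 2, so a^2 - b^2 = 5 is odd and the gcd is 1.
assert gcd(3**2 - 2**2, 3**2 + 2**2) == gcd(5, 13) == 1

# Even case: a = 3, b = 1, so a^2 - b^2 = 8 is even and the gcd is 2.
assert gcd(3**2 - 1**2, 3**2 + 1**2) == gcd(8, 10) == 2
```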


Any prime power dividing both $a^2-b^2$ and $a^2+b^2$ divides their sum $2a^2$ and difference $2b^2$, so it must be $1$ or $2^1$.
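Spelled out slightly (a sketch of the same argument): if a prime power $p^k$ divides both quantities, then it divides their sum and their difference,

$$p^k \mid (a^2+b^2) + (a^2-b^2) = 2a^2, \qquad p^k \mid (a^2+b^2) - (a^2-b^2) = 2b^2.$$

If $p$ is odd, then $p \mid a^2$ and $p \mid b^2$, contradicting $\gcd(a,b)=1$. If $p=2$, then $k \le 1$, because $a$ and $b$ cannot both be even, so one of $2a^2,\,2b^2$ is exactly divisible by $2$. Hence every common divisor is $1$ or $2$.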


Oh, just throw rocks at it.

$\gcd(x,y) = \gcd(x, x \pm y)$ so we can get:

$\gcd(a^2 -b^2, a^2 + b^2) = \gcd(a^2 -b^2, 2a^2)$

And $\gcd(a^2 -b^2, a^2 +b^2) = \gcd(a^2-b^2, 2b^2)$

$=\gcd(a^2 +b^2, 2a^2) = \gcd(a^2 + b^2, 2b^2)$.

So if $K = \gcd(a^2 - b^2, a^2 + b^2)$, then $K\mid 2a^2$ and $K\mid 2b^2$, hence $K \mid \gcd(2a^2, 2b^2) = 2\gcd(a^2, b^2) = 2$, since $a$ and $b$ (and therefore $a^2$ and $b^2$) have no factors in common. So $K = 1$ or $K = 2$.