If $a,b$ are positive integers such that $\gcd(a,b)=1$, then show that $\gcd(a+b, a-b)=1$ or $2$ and $\gcd(a^2+b^2, a^2-b^2)=1$ or $2$.
Progress
We have $\gcd(a,b)=1\implies \exists\, u,v\in \mathbb Z$ such that $au+bv=1$. Squaring gives $(au+bv)^2=1$, i.e. $a^2u'+bv'=1$ for some $u',v'\in \mathbb Z$.
Let $\gcd(a+b, a-b)=d$. Then $d\mid (a+b)x+(a-b)y$ for all $x,y\in \mathbb Z$.
How can we show that $\gcd(a+b, a-b)=1$ or $2$ and $\gcd(a^2+b^2, a^2-b^2)=1$ or $2$?
Let $d = \gcd(a+b,a-b)$. So $d \mid (a-b)$ and $d \mid (a+b)$. Thus:
$d \mid (a-b)+(a+b) = 2a$ and $d \mid (a+b) - (a-b) = 2b$. So $d \mid \gcd(2a,2b) = 2\gcd(a,b) = 2$.
Thus $d = 1$ or $2$.
For the other one, observe that $2(a^2 + b^2) = (a+b)^2 + (a-b)^2$ and $a^2 - b^2 = (a-b)(a+b)$. Thus if $d = \gcd(a^2+b^2, a^2-b^2)$, then $d \mid (a^2+b^2)+(a^2-b^2) = 2a^2$ and $d \mid (a^2+b^2)-(a^2-b^2) = 2b^2$. Thus $d \mid \gcd(2a^2,2b^2) = 2\gcd(a^2,b^2) = 2(\gcd(a,b))^2 = 2\cdot 1 = 2$. Thus $d = 1$ or $2$.
Note: $\gcd(a^2,b^2) = (\gcd(a,b))^2$ is easy to prove.