The problem reads like this:
Prove that if $\gcd(a, b) = 1$, then $\gcd(a + b, a - b)$ equals $1$ or $2$.
Solution: From the linear combinations
$(a + b) \cdot 1 + (a - b) \cdot 1 = 2a$
$(a + b) \cdot 1 + (a - b) \cdot (-1) = 2b$
we know that $\gcd(a + b, a - b)$ divides both $2a$ and $2b$. Since $\gcd(a, b) = 1$, we conclude that $\gcd(a + b, a - b)$ divides $2$. Consequently $\gcd(a + b, a - b)$ is either $1$ or $2$.
I tried to follow this proof, but I got stuck at the "From the linear combinations ... we know that ..." part. Where do these linear combinations come from? In particular, the $(a + b) \cdot 1 + (a - b) \cdot (-1) = 2b$ step is hard to understand: where does the $-1$ come from?
I'm familiar with Euclid's algorithm, but I don't understand how the linear combinations above fit into the picture.
Let $d=\gcd(a+b,a-b)$. By definition, $d\mid (a+b)$ and $d\mid(a-b)$, and $d$ is the largest such integer. Therefore $d$ divides any linear combination of $(a+b)$ and $(a-b)$.

We want to bring the hypothesis $\gcd(a,b)=1$ into play, so we look for linear combinations of $(a+b)$ and $(a-b)$ that produce expressions in $a$ and $b$ alone. Adding the two quantities eliminates $b$ and gives $(a+b)+(a-b)=2a$; subtracting eliminates $a$ and gives $(a+b)-(a-b)=2b$. That subtraction is exactly where the coefficient $-1$ comes from. (Any other combinations that work would do just as well.) Since $d$ divides every linear combination of $(a+b)$ and $(a-b)$, we get $d\mid 2a$ and $d\mid 2b$.

Now use $\gcd(a,b)=1$: there are integers $r,s$ such that $ar+bs=1$, whence $(2a)r+(2b)s=2$. But since $d\mid 2a$ and $d\mid 2b$, we have $d\mid \big((2a)r+(2b)s\big)$, i.e. $d\mid 2$. So $d=1$ or $d=2$.
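To see that both values really occur, here is a quick check with small coprime pairs (my own examples, not part of the quoted solution):
$$\begin{aligned}
a=3,\ b=2:&\quad \gcd(a+b,\,a-b)=\gcd(5,\,1)=1,\\
a=3,\ b=1:&\quad \gcd(a+b,\,a-b)=\gcd(4,\,2)=2.
\end{aligned}$$
So the bound $d\mid 2$ is sharp: whether $d$ is $1$ or $2$ depends on the parity of $a+b$ and $a-b$.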