This is the question and solution:
Q: Prove that for any integer $a$, $2a + 1$ and $4a^2 + 1$ are relatively prime. A: Since $4a^2 + 1 = (2a - 1)(2a + 1) + 2$, any common divisor $d$ of $2a + 1$ and $4a^2 + 1$ must be a divisor of $2$. This means that $d = 1$ or $d = 2$. However, $2$ is not a common divisor, as both $2a + 1$ and $4a^2 + 1$ are odd. Therefore, the greatest common divisor must be $1$.
Can someone explain how they knew that the common divisor of $2a + 1$ and $4a^2 + 1$ would be a divisor of $2$ (the bolded portion)? Is this some number theory that I just don't know? Also, this topic is $\gcd$ (greatest common divisor in discrete mathematics). Thanks in advance.
I'll start with some examples with actual numbers...
Is $10$ a divisor of $20\times23=460$? Yes. Note that $10\mid20$. This gives us a general rule: If $d\mid a$, then $d\mid ab$.
Is $10$ a divisor of $580-110=470$? Yes. Note that $10\mid580$ and $10\mid110$. This gives us a general rule: If $d\mid a$ and $d\mid b$, then $d\mid a-b$.
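If it helps, you can spot-check both rules numerically. Here is a small sketch (the specific values $d$, $a$, $b$ are just the ones from the examples above):

```python
# Rule 1: if d | a, then d | a*b for any integer b.
d, a, b = 10, 20, 23
assert a % d == 0          # 10 divides 20
assert (a * b) % d == 0    # so 10 divides 20*23 = 460

# Rule 2: if d | a and d | b, then d | a - b.
d, a, b = 10, 580, 110
assert a % d == 0 and b % d == 0   # 10 divides both 580 and 110
assert (a - b) % d == 0            # so 10 divides 580 - 110 = 470
```

Checks like these are not proofs either, of course, but they make it easy to experiment with other values.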
(P.S. Those are not proofs, but I hope they give you an idea for why these things are true.)
So, we're assuming that this number, $d$, is a common divisor of $2a+1$ and $4a^2+1$.
Note that $d$ is a divisor of $(2a+1)(2a-1)$, because it's a divisor of one of the factors, namely $2a+1$ (our first rule).
We have: $$4a^2+1=(2a+1)(2a-1)+2$$ Subtracting $(2a+1)(2a-1)$ from both sides: $$(4a^2+1)-(2a+1)(2a-1)=2$$ From our second rule, $d$ divides the left-hand side, so $d\mid2$, which is exactly the step you asked about.
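You can also verify the key identity and the conclusion for a range of integers $a$; this sketch uses Python's standard `math.gcd`:

```python
from math import gcd

# Check, for many integers a, both the identity used in the proof
# and the conclusion that 2a+1 and 4a^2+1 are relatively prime.
for a in range(-1000, 1001):
    assert 4 * a * a + 1 == (2 * a + 1) * (2 * a - 1) + 2
    assert gcd(2 * a + 1, 4 * a * a + 1) == 1
```

Of course a finite check is no substitute for the proof, but it is a quick sanity test of the algebra.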