For which $a$ with $1 \leq a \leq 20$ is it true that $a^m+a^n=x^2$ for some positive integers $m,n,x$?
Here is what I have so far. Assuming without loss of generality that $m < n$ and writing $k = n-m,$
$a^m+a^n=a^m(a^k+1)=x^2.$
Since every prime factor of $a^m$ is a prime factor of $a,$ and no prime factor of $a$ can divide $a^k+1,$ we can say
$\gcd(a^m,a^k+1)=1.$
I am not sure how to finish the problem from here. Can someone help me?
Since $\gcd(a^m, a^k+1) = 1$ and the product $a^m(a^k+1) = x^2$ is a perfect square, both coprime factors $a^m$ and $a^k+1$ must themselves be perfect squares.
If $k=1$, then $a+1$ must be a perfect square, so given $1\le a\le 20$ we must have $a\in\{3,8,15\}$. Since none of these possible values of $a$ is itself a square, we must have $m=2r$ even, so we get the three families of solutions
$3^{2r}+3^{2r+1}=(2\cdot 3^r)^2,\qquad 8^{2r}+8^{2r+1}=(3\cdot 8^r)^2,\qquad 15^{2r}+15^{2r+1}=(4\cdot 15^r)^2.$
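As a quick numerical sanity check (a short Python sketch, not part of the argument itself), one can verify these identities for small $r$:

```python
from math import isqrt

# Verify a^{2r} + a^{2r+1} = (s * a^r)^2 with s = sqrt(a+1), for small r.
ok = True
for a in (3, 8, 15):
    s = isqrt(a + 1)          # a + 1 = 4, 9, 16 are perfect squares
    for r in range(1, 6):
        ok = ok and a**(2 * r) + a**(2 * r + 1) == (s * a**r) ** 2
print(ok)  # True
```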
Alternatively, if $k>1$, then $a^k+1=y^2$ means $y^2-a^k=1$, and by Catalan's conjecture (now Mihăilescu's theorem) the only solution in powers greater than $1$ is $3^2-2^3=1$. Hence $a=2$, $k=3$, and $a^k+1 = 9$. Since $a^m = 2^m$ must be a square, we must have $m$ even. So the only solutions in this case are given by $a=2$, $m=2r$ even, $n = m+3$:
$2^{2r}+2^{2r+3}=9\cdot 2^{2r}=(3\cdot 2^r)^2.$
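One can also check the Catalan step numerically in a finite range (a Python sketch; the bounds $a \le 20$, $k \le 10$ are arbitrary illustrations, while Mihăilescu's theorem covers all exponents): the only pair with $a^k+1$ a perfect square and $k>1$ is $(a,k)=(2,3)$.

```python
from math import isqrt

def is_square(t):
    r = isqrt(t)
    return r * r == t

# Look for a^k + 1 being a perfect square with k > 1.
hits = [(a, k) for a in range(2, 21) for k in range(2, 11) if is_square(a**k + 1)]
print(hits)  # [(2, 3)], i.e. 2^3 + 1 = 9 = 3^2
```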
Finally, note that the solutions for $a=8$ are a subset of those for $a=2$ (since $8^{2r}+8^{2r+1}=2^{6r}+2^{6r+3}$), so you end up with just three families of solutions, for $a=2$, $a=3$, and $a=15$.
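The whole classification can be confirmed by exhaustive search over small exponents (a Python sketch; the restriction $m < n$ matches the assumption in the argument above, and the exponent bound $8$ is an arbitrary choice for illustration):

```python
from math import isqrt

def is_square(t):
    r = isqrt(t)
    return r * r == t

# Collect every a in 1..20 admitting a^m + a^n = x^2 with m < n small.
found = set()
for a in range(1, 21):
    for m in range(1, 9):
        for n in range(m + 1, 9):   # m < n, as assumed in the argument
            if is_square(a**m + a**n):
                found.add(a)
print(sorted(found))  # [2, 3, 8, 15]
```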