A question about Legendre symbols


I am reading a book with corrected exercises, and I bumped into the following sentence: $$\left( \frac{2^m + 1}{3} \right)=\left( \frac{2}{3} \right)=-1$$ where we assumed that $2^m+1$ is prime. I have no clue where this fact comes from...


2 Answers


This is simple. If $2^m+1$ is prime and $m>1$, then $m$ is even: for odd $m=2n+1$ we have $$2^m+1=2^{2n+1}+1=2 \cdot 4^n+1 \equiv 2 \cdot 1^n+1 \equiv 0 \pmod 3,$$ so $3$ divides $2^m+1$ and it cannot be prime (the only exception is $m=1$, where $2^m+1=3$ itself). For even $m=2n$, we have $$2^m+1=2^{2n}+1=4^n+1 \equiv 1^n +1 \equiv 2 \pmod 3.$$
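For what it's worth, the two congruences above are easy to check numerically; here is a small sketch in Python (not part of the original answer):

```python
# Check 2^m + 1 mod 3 for small m:
# odd m  -> residue 0 (divisible by 3, hence composite for m > 1),
# even m -> residue 2.
for m in range(1, 21):
    r = (2**m + 1) % 3
    if m % 2 == 1:
        assert r == 0
    else:
        assert r == 2
```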


If $2^m+1$ is prime, then $m$ must be a power of $2$. To see this, let $m=ab$ where $b$ is an odd number $>1$. Then we may write $$\frac{(2^a)^b+1}{2^a+1}=2^{a(b-1)}-2^{a(b-2)}+\ldots+2^{2a}-2^a+1=n$$ This follows from the factor theorem, since $x=-1$ is a root of $x^b+1$ for odd $b$, so $x+1$ divides $x^b+1$; now substitute $x=2^a$. Therefore we have $$2^m+1=(2^a+1)n$$ But both $2^a+1,n>1$, so $2^m+1$ is not prime. Therefore $m$ can't have any odd factor $>1$, and so $m$ must be a power of $2$.
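The factorization argument above can be sketched numerically; the helper name below is my own, not from the answer:

```python
def odd_factor_divisor(m):
    """If m = a*b with b odd and b > 1, return the nontrivial
    divisor 2^a + 1 of 2^m + 1; return None when m is a power of 2."""
    for b in range(3, m + 1, 2):
        if m % b == 0:
            return 2 ** (m // b) + 1
    return None

# Whenever m has an odd factor > 1, 2^a + 1 properly divides 2^m + 1.
for m in range(2, 30):
    d = odd_factor_divisor(m)
    if d is not None:
        assert (2**m + 1) % d == 0 and 1 < d < 2**m + 1
```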

We may therefore write $2^m+1=2^{2^k}+1$. Now, working over $\Bbb Z_3$, what is $2^{2^k}+1\equiv (-1)^{2^k}+1 \pmod 3$?
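To close the loop numerically (my own sketch, not part of the answer): for $k\ge 1$ every number $2^{2^k}+1$ is $\equiv 2 \pmod 3$, and Euler's criterion $\left(\frac{n}{3}\right) \equiv n^{(3-1)/2} \pmod 3$ then gives the symbol $-1$.

```python
def legendre_mod3(n):
    """Legendre symbol (n/3) via Euler's criterion:
    n^((3-1)/2) mod 3, reading residue 2 as -1."""
    r = pow(n, 1, 3)  # (3 - 1) // 2 == 1
    return -1 if r == 2 else r

# Fermat-type numbers 2^(2^k) + 1 for k >= 1 are ≡ 2 (mod 3),
# so the Legendre symbol is always -1, as the question claims.
for k in range(1, 7):
    n = 2 ** (2 ** k) + 1
    assert n % 3 == 2
    assert legendre_mod3(n) == -1
```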