Prove that $x^2+1$ cannot be a perfect square for any positive integer x?


I started this problem by trying proof by contradiction.

I first noted that the problem stated that $x$ had to be a positive integer, and thus $x=0$ could not be a solution. I then assumed that $x^2+1=n^2$ for some integer $n$ other than $1$. From here I have tried various methods, to no avail:

  • Factoring:

$n^2-x^2=1\implies(n+x)(n-x)=1$. It would be nice if I could say $(n+x)=(n-x)=1$ and $(n+x)=(n-x)=-1$. However, the issue here is that $(n+x)$ and $(n-x)$ could take on any value. For example, $(n+x)=2$ and $(n-x)=\frac{1}{2}$, or $(n+x)=3$ and $(n-x)=\frac{1}{3}$. Thus, I ruled out factoring.

  • For any integer $n$, the square $n^2$ satisfies $n^2\equiv 0 \pmod 4$ or $n^2\equiv 1 \pmod 4$.

The trouble with this is that I would have to prove the above statement, so I ruled this out.

Does anyone have any tips on how to continue? I feel like this should be an easy proof, but no solution comes to me that does not require proving something else first.
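Before hunting for a proof, a quick brute-force search can build confidence in the claim. A minimal sketch (the helper function and the search bound `10**5` are my own, purely illustrative):

```python
import math

def is_perfect_square(m: int) -> bool:
    """Exact integer test; math.isqrt avoids floating-point error."""
    r = math.isqrt(m)
    return r * r == m

# Search for a counterexample among small positive integers.
counterexamples = [x for x in range(1, 10**5) if is_perfect_square(x * x + 1)]
print(counterexamples)  # an empty list: no small counterexample
```

Of course, an empty search result is evidence, not a proof.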

4 Answers

BEST ANSWER

Your factoring is absolutely correct. However, since $x$ and $n$ are integers, $n+x$ and $n-x$ are integers whose product is $1$, so $n+x = n-x = \pm 1$. Subtracting gives $x = 0$, contradicting the assumption that $x$ is positive.
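Spelling out the case analysis (my own elaboration): since $n+x$ and $n-x$ are integers, the only integer factorizations of $1$ are $1\cdot 1$ and $(-1)\cdot(-1)$, and either case forces $x=0$:

```latex
% The two integer factorizations of 1, each subtracted to isolate x:
\begin{align*}
n+x = n-x = 1  &\implies 2x = (n+x) - (n-x) = 0 \implies x = 0,\\
n+x = n-x = -1 &\implies 2x = (n+x) - (n-x) = 0 \implies x = 0,
\end{align*}
% either way contradicting the assumption that x is a positive integer.
```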

An alternate proof would use the fact that no perfect square lies strictly between two consecutive squares, together with $$x^2 < n^2 < x^2+2x+1 = (x+1)^2.$$

ANSWER

For any integer $n$, its square is $n^2$.

The next integer is $n+1$, and its square is $n^2 + 2n + 1$.

Since (assuming $x>0$) $$x^2 < x^2+1 < x^2 + 2x + 1$$

it is also true that $$x = \sqrt{x^2} < \sqrt{x^2+1} < \sqrt{x^2 + 2x + 1} = x+1,$$ but there are no integers strictly between $x$ and $x+1$.
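The sandwich argument above can be checked numerically: `math.isqrt` returns the exact floor of the square root, so finding $\lfloor\sqrt{x^2+1}\rfloor = x$ for each sample value confirms that $\sqrt{x^2+1}$ lies strictly between $x$ and $x+1$ there (the sample values are my own choice):

```python
import math

for x in [1, 2, 3, 10, 12345, 10**9]:
    m = x * x + 1
    r = math.isqrt(m)   # exact floor of the square root of m
    assert r == x       # so sqrt(m) lies strictly between x and x + 1
    assert r * r != m   # hence m is not a perfect square
print("all sample values checked")
```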

ANSWER

You could try a proof by contradiction: assume that $x^2+1$ is a perfect square for some $x\in\mathbb{Z}$, then show that this is impossible.

ANSWER

Just for fun (and also because this approach could apply to other similar problems), I propose a solution using Gaussian integers. Consider the extension $\mathbb{Q}(i)/\mathbb{Q}$ and its norm map $N$. The Diophantine equation $x^2+1=n^2$ can also be written $N(x+i)=N(n)$, or equivalently $x+i=nz$, where $z$ is an element of $\mathbb{Q}(i)$ of norm $1$. By Hilbert's Theorem 90, such a $z$ is a quotient $(a+ib)/(a-ib)$, where a priori $a$ and $b$ belong to $\mathbb{Q}$; but the form of the fraction allows us to take $a$ and $b$ to be coprime integers. Equating real and imaginary parts gives the two equations $$x = n\,\frac{a^2-b^2}{a^2+b^2} \qquad\text{and}\qquad 1 = \frac{2nab}{a^2+b^2},$$ which together yield $2x = (a^2-b^2)/(ab)$. But the RHS fraction is irreducible: if a prime $p$ divides the denominator $ab$, then $p$ divides exactly one of $a$ and $b$ (they are coprime), hence $p$ does not divide the numerator $a^2-b^2$. So $ab=\pm 1$, i.e. $a=\pm 1$ and $b=\pm 1$, which forces $a^2-b^2=0$ and $x=0$; this is excluded since $x$ is a positive integer.
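The key divisibility step, that $ab \mid a^2-b^2$ fails for coprime $a,b$ unless $a=\pm 1$ and $b=\pm 1$, can be sanity-checked by a small search (my own illustration; the bound 20 is arbitrary):

```python
from math import gcd

# Collect coprime pairs (a, b) with ab | a^2 - b^2.
hits = []
for a in range(-20, 21):
    for b in range(-20, 21):
        if a != 0 and b != 0 and gcd(a, b) == 1 and (a*a - b*b) % (a*b) == 0:
            hits.append((a, b))

# Only the four pairs (+-1, +-1) survive, all giving numerator a^2 - b^2 = 0,
# which corresponds to x = 0.
assert all(abs(a) == 1 and abs(b) == 1 for a, b in hits)
print(hits)
```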