Q: Proof Involving Natural Numbers and Perfect Squares


Let $n \in \mathbb{N}$. Prove that neither $n(n+1)$ nor $n(n+2)$ is a perfect square.
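As a quick numerical sanity check (not a proof), and assuming $\mathbb{N}$ starts at $1$, one can confirm the claim for small $n$; `is_square` is a helper introduced here, not part of the problem:

```python
import math

def is_square(m: int) -> bool:
    """Return True if m is a perfect square."""
    r = math.isqrt(m)
    return r * r == m

# Neither n(n+1) nor n(n+2) is a perfect square for 1 <= n <= 10000.
for n in range(1, 10001):
    assert not is_square(n * (n + 1))
    assert not is_square(n * (n + 2))
```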

I tried a proof by contradiction and I just want to make sure that I didn't violate anything. I wrote:

Assume $n \in \mathbb{Z}$, and for the sake of contradiction, assume that $n(n+1)$ and $n(n+2)$ are perfect squares.
Thus $n(n+1)=a^2$ and $n(n+2)=b^2$ for some integers $a$ and $b$.
Then $n(n+1)=n^2+n=a^2$ and $n(n+2)=n^2+2n=b^2$.
Subtracting $b^2$ from $a^2$ we get $-n=a^2-b^2$.
Thus we can see that $n=-(a+b)(a-b)$.

Consider three cases: $a>b$, $b>a$, and $a=b$.

Case I: $a > b$.
$a>b \implies n$ is negative, since $(a+b)$ and $(a-b)$ will both be positive, so $-(a+b)(a-b)$ is negative. This contradicts our assumption that $n \in \mathbb{N}$, so Case I isn't satisfied.


Case II: $b > a$.
$b > a \implies n$ is positive, since $(a+b)$ will be positive and $(a-b)$ will be negative, so $-(a+b)(a-b)$ is positive. Thus Case II is satisfied.

Case III: $a=b$.
$a=b \implies n$ is zero, since $(a-b)$ will be zero, so the product is zero. This also contradicts our assumption that $n \in \mathbb{N}$, so Case III isn't satisfied.

Therefore, since two of the cases yield a contradiction, our assumption that both $n(n+1)$ and $n(n+2)$ are perfect squares is false. Thus neither $n(n+1)$ nor $n(n+2)$ is a perfect square.


There are 3 best solutions below


Your proof is not valid.

Note that the negation of "neither $n(n+1)$ nor $n(n+2)$ is a perfect square" is "at least one of $n(n+1)$ or $n(n+2)$ is a perfect square."

You have assumed that both $n(n+1)$ and $n(n+2)$ are perfect squares in order to reach a contradiction.


As already said in the comments, we need to show that "$n(n+1)$ is a perfect square or $n(n+2)$ is a perfect square" leads to a contradiction. So, say for example that $n(n+1)$ is a perfect square. This satisfies the "or" condition alone, so this must lead to a contradiction (and the same is true for $n(n+2)$).

Thus we have to prove that both conditions are, independently, contradictory.

Suppose first that $n(n+1)=a^2$ for some $a\in \Bbb N$. Following lulu's hint, we know that $n$ and $n+1$ are relatively prime (since a common divisor of $n$ and $n+1$ must divide $1$). Therefore, the prime factorizations of $n$ and $n+1$ share no prime factors. Because of this, and since $n(n+1)=a^2$, looking at the prime factorization of $a^2$ we conclude that every prime in the factorization of $n$ and of $n+1$ appears with an even exponent. Thus, $n$ and $n+1$ are perfect squares too.
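This lemma — the product of two coprime naturals is a perfect square only if each factor is — can be spot-checked for small values (a numerical sketch, not a substitute for the factorization argument; `is_square` is a helper defined here):

```python
import math

def is_square(m: int) -> bool:
    r = math.isqrt(m)
    return r * r == m

# If gcd(x, y) = 1 and x*y is a perfect square,
# then x and y are each perfect squares.
for x in range(1, 200):
    for y in range(1, 200):
        if math.gcd(x, y) == 1 and is_square(x * y):
            assert is_square(x) and is_square(y)
```

Note the coprimality hypothesis is essential: $2 \cdot 8 = 16$ is a square although neither $2$ nor $8$ is.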

So, $n=p^2$ and $n+1=q^2$. This implies $1=q^2-p^2=(q+p)(q-p)$, which happens only if $q=1$ and $p=0$, and this is ruled out if you suppose $0\notin \Bbb N$. ^^'

Now, the other case. Suppose $n(n+2)=a^2$. Any common divisor of $n$ and $n+2$ must divide $2$, so $\mathrm{gcd}(n,n+2)\in \{1,2\}$. If it is $1$, then just as above we get $n=p^2$ and $n+2=q^2$, implying $2=q^2-p^2=(q+p)(q-p)$, so either ($q+p=2$ and $q-p=1$) or ($q+p=1$ and $q-p=2$). Both cases give $2q=3$, which is impossible.

If $\mathrm{gcd}(n,n+2)=2$, then $n$ is even, as is $n+2$. So, writing $n=2k$, we have $2k(2k+2)=a^2\implies 4k(k+1)=a^2$; thus $a$ is even, say $a=2m$, and $k(k+1)=m^2$, reducing the problem to the previous case.
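The even-case reduction rests on the identity $2k(2k+2)=4k(k+1)$: since $4$ is itself a square, $n(n+2)$ is a square exactly when $k(k+1)$ is. A small numerical check of both claims:

```python
import math

def is_square(m: int) -> bool:
    r = math.isqrt(m)
    return r * r == m

# For even n = 2k: n(n+2) = 4k(k+1), and since 4 is a square,
# n(n+2) is a perfect square iff k(k+1) is.
for k in range(1, 1000):
    n = 2 * k
    assert n * (n + 2) == 4 * k * (k + 1)
    assert is_square(n * (n + 2)) == is_square(k * (k + 1))
```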


Other people have pointed out flaws in your proof. Here's a sketched outline of a much simpler proof.

Let's suppose $x = n(n+1)$ (or $x = n(n+2)$) is a perfect square. In both cases, multiplying out the brackets shows that $n^2 < x < (n+1)^2$. So which natural number could $x$ possibly be the square of? It would have to lie strictly between $n$ and $n+1$, but there is no such natural number.
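The squeeze follows from $n^2 < n^2+n \le n^2+2n < n^2+2n+1 = (n+1)^2$ for $n \ge 1$. A brute-force check of the chained inequality for small $n$:

```python
# For n >= 1: n^2 < n(n+1) < n(n+2) < (n+1)^2, so neither product
# can equal k^2 for any integer k (no integer lies strictly
# between n and n+1).
ok = True
for n in range(1, 100000):
    ok = ok and (n * n < n * (n + 1) < n * (n + 2) < (n + 1) ** 2)
```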