Prove that $\sqrt{(1+1/n)}$ is irrational for all $n\in \mathbb{N}$


I have trouble solving this probably simple problem:

Prove that $\sqrt{1+1/n}$ is irrational for all $n\in \mathbb{N}$.


There are 3 best solutions below

BEST ANSWER

Hint: $$ \sqrt{1+\frac{1}{n}}=\frac{\sqrt{n^2+n}}{n} $$ so this is equivalent to proving that $\sqrt{n^2+n}$ is irrational, which is simpler, because, for a positive integer $m$, $\sqrt{m}$ is rational if and only if $m$ is a perfect square (prove it).

Moreover $n^2+n>n^2$ and…
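As a quick numerical sanity check of this hint (not a substitute for the proof), one can confirm that $n^2+n$ is never a perfect square for many values of $n$:

```python
from math import isqrt

# Sanity check of the hint: n^2 + n is never a perfect square for n >= 1,
# so sqrt(n^2 + n) -- and hence sqrt(1 + 1/n) = sqrt(n^2 + n)/n -- is irrational.
for n in range(1, 10_000):
    m = n * n + n
    assert isqrt(m) ** 2 != m, f"n = {n} gives a perfect square"
print("no perfect square of the form n^2 + n for 1 <= n < 10000")
```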

ANSWER

Note that $1+\frac{1}{n}=\frac{n+1}{n}$. Now suppose such a square root were rational; then there exist $a,b>0$ with $\gcd(a,b)=1$ such that $$\frac{a^2}{b^2}=\frac{n+1}{n}.$$ Now $\gcd(a^2,b^2)=\gcd(n+1,n)=1$, and since reduction of a fraction to lowest terms is unique, we obtain $a^2=n+1$ and $b^2=n$. But no two nonzero squares are one apart.
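The final fact used here can be checked numerically (a sanity check only, not the proof): no $n \ge 1$ has both $n$ and $n+1$ perfect squares.

```python
from math import isqrt

# Check the key fact: no two nonzero perfect squares differ by exactly 1.
# (Equivalently, b^2 = n and a^2 = n + 1 is impossible for n >= 1.)
for n in range(1, 10_000):
    both_squares = isqrt(n) ** 2 == n and isqrt(n + 1) ** 2 == n + 1
    assert not both_squares, f"n = {n}: both n and n+1 are squares"
```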

ANSWER

Here's another way.

$\sqrt{1 + \frac 1n} = \sqrt{\frac {n+1}{n}} = \sqrt{\frac {n(n+1)}{n^2}}= \frac 1n \sqrt{n(n+1)}$, which is rational if and only if $\sqrt{n(n+1)}$ is rational.

The square root of an integer is rational only when the integer is a perfect square (in which case the root is itself an integer). So $\sqrt{n(n+1)}$ is rational only if $n(n+1)$ is a perfect square.

But if $n > 0$ (which it must be, since $n$ is natural and $\frac 1n$ must be defined), then:

$n^2 = n\cdot n < n(n+1) < (n+1)(n+1) = (n+1)^2$.

If $n(n+1) = k^2$ for some integer $k \ge 0$, then $n < k < n+1$, which is impossible for an integer $k$; so $n(n+1)$ is not a perfect square.

[The exceptions are $n = 0$ and $n = -1$, where $n(n+1) = 0^2$. The reason the argument fails there: if $n = 0$ then $n\cdot n = n(n+1) < (n+1)^2$, so $n \le k < n+1$ and $k = n = 0$; if $n = -1 < 0$ then $n+1 = 0$ and $n\cdot n > n(n+1) = (n+1)^2$, so $k = n+1 = 0$.]
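The squeeze above can be sketched as a quick check: for $n \ge 1$ the integer square root of $n(n+1)$ is exactly $n$, so $n(n+1)$ sits strictly between consecutive squares.

```python
from math import isqrt

# Verify the squeeze n^2 < n(n+1) < (n+1)^2 for n >= 1: the integer square
# root of n(n+1) is exactly n, so n(n+1) lies strictly between consecutive
# perfect squares and cannot itself be one.
for n in range(1, 10_000):
    assert n * n < n * (n + 1) < (n + 1) ** 2
    assert isqrt(n * (n + 1)) == n
```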

======

Ooooh..... this is cute:

Let $1 + \frac 1n = \frac {a^2}{b^2}$ with $a, b \in \mathbb Z$, $\gcd(a,b) = 1$, $b \ne 0$.

Then $b^2\left(1 + \frac 1n\right) = a^2$, so

$a^2 - b^2 = \frac {b^2}{n}$,

$(a-b)(a+b) = \frac {b^2}{n}\in \mathbb Z$.

Suppose a prime $p$ divides $\frac {b^2}{n}$. Then $p \mid b^2$, so $p \mid b$. Also $p \mid (a-b)(a+b)$, so $p \mid a-b$ or $p \mid a+b$; together with $p \mid b$ this gives $p \mid a$, contradicting $\gcd(a,b) = 1$. So $\frac {b^2}{n}$ has no prime divisors, and since it is a positive integer (it equals $a^2 - b^2 > 0$), we get $\frac {b^2}{n} = 1$.

$(a-b)(a+b) = 1$, so $a-b = a+b = \pm 1$, which forces $b = 0$, contradicting $b \ne 0$.
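The $\frac{a^2}{b^2}$ framing can also be checked directly with exact rational arithmetic (a numerical sanity check, not the proof): in lowest terms $1 + \frac 1n = \frac{n+1}{n}$, and a reduced fraction is a rational square only if numerator and denominator are both perfect squares.

```python
from fractions import Fraction
from math import isqrt

# Check directly that 1 + 1/n is never the square of a rational: write it in
# lowest terms as a/b; it is a rational square iff both a and b are perfect
# squares, which never happens since a = n + 1 and b = n are consecutive.
for n in range(1, 10_000):
    q = 1 + Fraction(1, n)   # equals (n+1)/n, automatically in lowest terms
    a, b = q.numerator, q.denominator
    assert not (isqrt(a) ** 2 == a and isqrt(b) ** 2 == b)
```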