Intuitive reason why $\sqrt[n]n\to 1$ as $n\to\infty$?


We are aware of the limit $$ \lim_{n\to\infty}\sqrt[n]n = 1; $$ is there any geometric or otherwise intuitive reason to see why this limit holds?


Edit: I am adding some context, since this question was previously put on hold, and I think one of the main reasons was that it was poorly motivated. From Theorem 8.1 of Baby Rudin: suppose the series $$ \sum_{n=0}^\infty c_nx^n $$ converges for $|x|<R$, and define $$ f(x) = \sum_{n=0}^\infty c_nx^n \qquad (|x|<R). \tag{1} $$ Among other conclusions, the function $f$ is differentiable in $(-R,R)$, and $$ f'(x) = \sum_{n=1}^\infty nc_n x^{n-1} \qquad (|x|<R). \tag{2} $$ Rudin uses the fact that $\sqrt[n]n\to 1$ as $n\to\infty$ to justify that the series in $(1)$ and the series in $(2)$ have the same radius of convergence. I recognized the limit, but it is such a nice combination of $n$ and the $n$th root that I thought there ought to be some nice intuitive way to understand it, hence this question.

There are 15 answers below.

Answer (3 votes):

Write $L$ for the limit and take logarithms: $$\lim_{n \to \infty} n^{1/n} = L$$ $$\lim_{n \to \infty} \frac{\ln(n)}{n} = \ln(L)$$ Since $\ln(n)$ grows far more slowly than $n$, the left-hand limit is $0$, so $$0 = \ln(L)$$ $$L = 1$$

But alternatively,

$$\lim_{n \to \infty} n^{1/n} = \lim_{u \to 0^+} \left(\frac{1}{u}\right)^u = \lim_{u \to 0^+} \frac{1}{u^u},$$

which, since $u^u \to 1$ as $u \to 0^+$, is intuitively $\frac{1}{1} = 1$.

Answer (3 votes):

$$n^{1/n} = \exp\left(\frac{\log(n)}n\right)$$ $\log(n)$ grows, but very slowly, slower than $n$ or any positive power of $n$. So $\log(n)/n \to 0$ as $n \to \infty$, and $n^{1/n} \to \exp(0) = 1$.
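A quick numerical check (a Python sketch I am adding; it is not part of the original answer) makes visible how slowly $\log(n)/n$ dies out and how $n^{1/n}$ is dragged down to $1$ with it:

```python
import math

# n**(1/n) = exp(log(n)/n): as log(n)/n shrinks to 0, n**(1/n) shrinks to 1.
for n in [10, 100, 10_000, 1_000_000]:
    print(f"n = {n:>9}   log(n)/n = {math.log(n) / n:.6f}   n^(1/n) = {n ** (1 / n):.6f}")
```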

Answer (0 votes):

You can write the limit like this: $\lim\limits_{n\to\infty}\sqrt[n]{n}=\lim\limits_{n\to\infty}e^{\frac{\ln(n)}{n}}=e^{\lim\limits_{n\to\infty}\frac{\ln(n)}{n}}$. The limit of $\frac{\ln(n)}{n}$ is zero, and $e^0=1$.

Answer (0 votes):

Here the issue is that $n \rightarrow \infty$, but for any fixed $x > 0$ we have $x^{1/n} \rightarrow 1$. So to consider $n^{1/n}$ you have to ask which "wins": the $n$ at the base or the $1/n$ in the exponent. To think about this, it might help to compare to, say, $(2^n)^{1/n}$. This tends to (and is equal to) $2$. The exponent of $1/n$ has the power to take a huge number like $2^n$ and reduce it to a constant. Since $n$ is much, much smaller than $2^n$, you might expect then that the power of $1/n$ would "win" in the end, giving a result of $1$. This is not a proof, but it gives the right intuition.

Answer (0 votes):

The intuition is that the "$n$" in the index of the radical is dominant when compared to the "$n$" in the radicand.

So, for example, suppose $n=10^6$. One million is a large number. What happens if we take its millionth root (i.e., $(10^6)^{10^{-6}}$)? Well, let's do this one step at a time.

If you take the square root of $10^6$, it is downsized considerably, and you get $1000$.

If you take its cube root, you get an even smaller number, namely $100$.

If you take its sixth root, you get $10$.

Well, we still have $999994$ "roots" to go. Hence, it's fairly reasonable to assume that after all of these "roots", we get a number quite close to $1$. And, as it so happens, as we take larger and larger $n$, we can make $n^{\frac 1n}$ arbitrarily close to $1$.
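This step-by-step "rooting down" of one million can be replayed numerically; a Python sketch of the example above (mine, not the answerer's):

```python
# Replay the answer's example: take ever-higher roots of one million.
n = 10 ** 6
for k in [2, 3, 6, 100, 10_000, n]:
    print(f"{k}-th root of 10^6 = {n ** (1 / k):.6f}")
```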

Answer (1 vote):

The question asks for an intuitive reason for the value of the limit $L$ if it exists. Look at a slightly more general limit.

Set $f(n):=\sqrt[n]{n^k}$ for some integer $k>0$ and assume $f(n) \to L$ as $n\to\infty$.

So $f(n)^n = n^k$, and if the limit exists we need $L^n \approx n^k$ for large $n$. Now $L<1$ is impossible, since then $L^n$ is decreasing with $L^n<1<n^k$. So suppose $L>1$. Then we have $L^n \approx n^k$ as $n$ gets big.

But exponential growth with base $L>1$ eventually exceeds any polynomial function of $n$ such as $n^k$, a contradiction. Hence $L=1$.

Answer (1 vote):

One way to see this is: for example, $(1.01)^n$ grows faster than $n$, so for large enough $n$, $n$ will eventually be less than $(1.01)^n$. Once you have reached this point, you will have $1 < n^{1/n} < 1.01$. Now, just replace $1.01$ with $1 + \epsilon$ for any $\epsilon > 0$, and you will have a proof of the limit.

To reduce the first statement even further: let $a_n := \frac{n}{(1.01)^n}$. Then $\lim_{n\to\infty} \frac{a_{n+1}}{a_n} = \lim_{n\to\infty} \frac{1 + 1/n}{1.01} = \frac{1}{1.01} < 1$. Therefore, suppose $N$ is such that $\frac{a_{n+1}}{a_n} < 0.995$ for $n \ge N$; then for $n > N$, $0 < a_n < a_N (0.995)^{n-N}$, so by the squeeze theorem, $\lim_{n\to \infty} a_n = 0$. So, the intuitive point here is: even though the base of the exponent is only marginally greater than 1, it eventually starts making $a_n$ decrease approximately like a geometric sequence with ratio $\frac{1}{1.01}$.

(Of course, the other answers expressing $n^{1/n}$ as $e^{\frac{1}{n} \ln n}$ will give a much better idea of how fast $n^{1/n}$ approaches 1: namely, $n^{1/n}$ is approximately equal to $1 + \frac{\ln n}{n}$ for large $n$.)
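Both claims here are easy to check numerically; the following Python sketch is mine (the scan bound of $5000$ is an arbitrary choice, not from the answer):

```python
import math

# Claim 1: (1.01)**n eventually overtakes n for good (scan a finite range).
last_crossing = max(n for n in range(1, 5000) if 1.01 ** n <= n)
print("(1.01)^n <= n holds for the last time at n =", last_crossing)

# Past that point the squeeze 1 < n**(1/n) < 1.01 applies:
n = 2000
print("n^(1/n) at n = 2000:", n ** (1 / n))

# Claim 2: the rate of approach is roughly 1 + ln(n)/n.
n = 10 ** 6
print(n ** (1 / n), "vs", 1 + math.log(n) / n)
```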

Answer (0 votes):

If you know the Arithmetic-Geometric Mean Inequality, apply it to the $n$ numbers $\sqrt n,\ \sqrt n,\ 1,\ \dots,\ 1$ (with $n-2$ ones), whose product is $n$; then you can see that

$$1\le\sqrt[n]n\le{\sqrt n+\sqrt n+1+\cdots+1\over n}={2\sqrt n+(n-2)\over n}=1+{2\over\sqrt n}-{2\over n}$$

and the Squeeze Theorem gives the limit $\sqrt[n]n\to1$.
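The AM-GM bound above is easy to sanity-check in Python (a sketch I am adding, not part of the original answer):

```python
import math

# AM-GM applied to sqrt(n), sqrt(n), 1, ..., 1 (n numbers, product n)
# gives n**(1/n) <= 1 + 2/sqrt(n) - 2/n, squeezing the root toward 1.
for n in [2, 10, 100, 10_000]:
    root, bound = n ** (1 / n), 1 + 2 / math.sqrt(n) - 2 / n
    assert 1 <= root <= bound + 1e-12
    print(f"n = {n:>6}   n^(1/n) = {root:.5f}   AM-GM bound = {bound:.5f}")
```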

Answer (0 votes):

Let's take the obvious $\displaystyle \sqrt[n]{c^n}=c$.

But we know the following facts:

  • if $0\le a<1$, then $a^n\to 0$, so $a^n\ll n$;
  • if $b>1$, then $b^n\to+\infty$, and in fact $n\ll b^n$.

So for large $n$ we have the squeeze $\mathbf{a^n\ll n\ll b^n}$, which becomes, on taking $n$th roots: $\quad\mathbf{a\le \sqrt[n]{n}\le b}$

for any $a<1$ and any $b>1$, which forces the limit to be $1$.


As you can notice, the gap in $a^n\ll \cdots\ll b^n$ is huge, and many sequences fit there: not only $n$, but $n^2$, $n^{100}$, or any polynomial as well. This makes $\sqrt[n]{\ }$ a powerful reducing tool, in the same way that $\ln(x^k)=k\ln(x)$ flattens powers.

This is indeed what happens in the identity $\displaystyle \sqrt[n]{n^k}=e^ {\frac 1n\ln(n^k)}=e^{k\overbrace{\frac{\ln(n)}{n}}^{\to 0}}\to 1.$

Answer (0 votes):

If "intuitive reasons" include arguments that are natural, then the following may be of interest.

A recurring type of argument about convergence of sequences goes through subsequences. For example, to see that $\sqrt[n]{a}$ (for $a>1$) must converge to $1$ if it converges at all (which it does, since it is decreasing), notice that $\sqrt[2n]{a}$ must converge to the same limit $L$, since it is a subsequence. But $(\sqrt[2n]{a})^2=\sqrt[n]{a}$, and therefore $L^2=L \implies L=1$, since the limit obviously cannot be $0$.

This is just to exemplify a natural argument which commonly applies. Trying to apply it directly to $n^{1/n}$ (taking the subsequence $2n$) runs into problems, since the resulting factor $\sqrt[n]{2}$ tends to $1$. Therefore, we must take a subsequence which increases faster, and it is natural to consider $n^2$. We then have that, if $n^{1/n}$ converges to $L$, then $n^{2/n^2}=((n^{1/n})^{1/n})^2$ converges to the same limit. But since $n^{1/n}$ is assumed to be convergent, we have for sufficiently large $n$: $$1<((n^{1/n})^{1/n})^2 < ((L+1)^{1/n})^2.$$ By the squeeze theorem, $n^{2/n^2}$ then converges to $1$, and hence so does $n^{1/n}$. In fact, this is a full proof, except that we are supposing that $n^{1/n}$ converges (but it does, since the sequence is eventually decreasing).

PS: This argument has echoes of the philosophy behind the Cauchy condensation test.
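Numerically, the subsequence $(n^2)^{1/n^2}$ does rush to $1$ much faster than $n^{1/n}$; a quick Python sketch (mine, not the answerer's):

```python
# Compare n**(1/n) with the subsequence term (n**2)**(1/n**2):
# the latter is squeezed toward 1 far more quickly.
for n in [10, 100, 1000]:
    print(f"n = {n:>4}   n^(1/n) = {n ** (1 / n):.6f}   (n^2)^(1/n^2) = {(n * n) ** (1 / (n * n)):.6f}")
```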

Answer (2 votes):

\begin{align} & \underbrace{2\times \cdots\cdots\cdots\cdots\times 2}_{\Large n \text{ factors}} \gg n. \\[15pt] \text{Therefore } & \frac n {\quad\underbrace{2\times \cdots\cdots\cdots\cdots\times 2}_{\Large n \text{ factors}}\quad} \approx 0 \text{ when } n\approx\infty. \\[15pt] \text{Let } m = 2^n. \text{Then } & \frac{\log_2 m} m \approx 0 \text{ when } m \approx\infty \\[10pt] & 2^{(\log_2 m)/m} \approx 1 \\[10pt] & m^{1/m} \approx 1. \end{align}

Answer (0 votes):

The simplest proof I know of uses only Bernoulli's inequality (which is simpler than the AM-GM inequality):

By Bernoulli's inequality, $(1+n^{-1/2})^n \ge 1+n\cdot n^{-1/2} = 1+n^{1/2} \gt n^{1/2}$, so, raising both sides to the $2/n$ power, $n^{1/n} \lt (1+n^{-1/2})^2 =1+2n^{-1/2}+n^{-1} \le 1+3n^{-1/2}$.
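A Python sketch (mine, not part of the original answer) confirming the Bernoulli-based bound over a few values of $n$:

```python
import math

# Bernoulli: (1 + 1/sqrt(n))**n >= 1 + sqrt(n) > sqrt(n); raising to the
# 2/n power yields n**(1/n) < (1 + 1/sqrt(n))**2 <= 1 + 3/sqrt(n).
for n in [1, 10, 1_000, 1_000_000]:
    assert n ** (1 / n) < (1 + n ** -0.5) ** 2 <= 1 + 3 / math.sqrt(n)
    print(f"n = {n:>9}   n^(1/n) = {n ** (1 / n):.6f}   bound = {1 + 3 / math.sqrt(n):.6f}")
```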

Answer (0 votes):

I see it as follows: $\sqrt[n]{n}$ is eventually decreasing, because its derivative (treating $n$ as a real variable) is $\frac{1}{n^2}\sqrt[n]{n}\left(1-\ln(n)\right)<0$ for $n>e$. And $1$ is a lower bound. So there is a limit $L$, and the question is whether $L$ is greater than $1$. If the limit were $L>1$, then for large $n$ we would have $$\sqrt[n]{n}= L+\varepsilon$$ $$n=(L+\varepsilon)^n>L^n,$$ and a linear $n$ cannot exceed an exponential with base greater than $1$.

Answer (2 votes):

Change $n=2^m:$ $$\lim_\limits{n\to\infty} n^{\frac{1}{n}} =\lim_\limits{m\to\infty} \left({2^m}\right)^{\frac{1}{2^m}}=$$ $$2^{\lim_\limits{m\to\infty} \frac{m}{2^m}}=2^0=1.$$

Answer (2 votes):

As a child you couldn't resist calculating the sequence $\{1,2,4,8,16,\dots\}$ in your mind, fascinated by the pattern of growth.

Years later, you are staring at $$ \tag 1 \lim_{n \to \infty} n^{1/n} = \,? $$ You know that for all $n \ge 0$, $n \le 2^n$, and as $n$ grows, you feel awkward even looking at the inequality—there is no comparison!

So, $(n)^{1/n} \le {(2^n)}^{1/n} = 2$, and if the sequence (1) converges it has to be between $1$ and $2$.

You now look at $$ \tag 2 (n)^{1/n} \le {(s^n)}^{1/n} = s $$ with $1 \lt s \le 2$.

You realize that if (2) holds true for sufficiently large $n$, then you've squeezed the convergence of (1) further, between $1$ and $s$. You let $s = 1.5$ and $n = 4$ and now 'you know' that the sequence (1) converges to $1$.


Proof

Let $0 \lt p \le 1$ be fixed. If we can show that $$ \tag 3 n \le (1 + p) ^ n $$ for all large $n$, then $1 \le n^{1/n} \le 1+p$ for all large $n$; since $p$ can be taken arbitrarily small, this proves that (1) converges to $1$.

For fun, we prove the case $p = 1$ again. The second term of the binomial expansion of the RHS of (3) is exactly equal to $n$, so not much to do there.

If $p \lt 1$, the second term (= $np$) will not suffice. So, hoping for the best, we examine the third term:
\begin{align*} n &\le \frac{1}{2} n(n-1) p^2 \text{ iff} \\ 2 &\le (n-1) p^2 \text{ iff} \\ n &\ge \frac{2}{p^2} + 1. \end{align*} Since every term of the binomial expansion is nonnegative, $n \le \binom{n}{2}p^2 \le (1+p)^n$ once $n \ge \frac{2}{p^2} + 1$, which establishes (3). The proof is complete.