A simple proof of $\lim_{n\to \infty} \frac{\ln n}{n}=0$ for high-school students


The question concerns the mathematics course for fifth-year students of a scientific high school; the order of topics in the textbook is almost identical to what I covered when I was studying at university.

Here is an outline of the topics:

$$\color{brown}{\text{sequences}}\to \color{red}{\text{topology in }\Bbb R \text{ and }\Bbb R^2}\to\color{gray}{\text{limits of functions}}\to$$ $$\color{magenta}{\text{continuity}}\to \color{cyan}{\text{discontinuity}}\to\color{teal}{\text{derivatives}}\to$$ $$\color{blue}{\text{max and min}}\to \color{green}{\text{study of real functions}}\to\color{orange}{\text{indefinite and definite integration}}$$ etc.

We suppose that we have this limit $$\lim_{n\to \infty} \frac{\ln n}{n}$$

It goes to $0$ because, for $n\in \Bbb N$ large, we have $0\leq \ln n<n$; that is, the sequence $\{\ln n\}$ grows much more slowly than the sequence $\{n\}$, so we say that $\{n\}$ dominates $\{\ln n\}$. Hence it is "similar" to having

$$\bbox[yellow,5px]{\lim_{n\to \infty} \frac{\text{constant}}{n}=0}$$

Is there an alternative clear proof that $$\lim_{n\to \infty} \frac{\ln n}{n}=0\,?$$

Is there also something like: $$\bbox[orange,5px]{\lim_{n\to \infty} \frac{\ln(f(n))}{g(n)}}$$

Is there any known limit if $f(n)$ and $g(n)$ are two polynomials with $$\deg(f(n))\gtreqless\deg(g(n))\quad ?$$
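Before turning to the answers, here is a quick numerical sanity check (a Python sketch; the sample polynomials $f(n)=3n^2+1$ and $g(n)=2n+5$ are arbitrary illustrative choices, not from the question):

```python
import math

# ln(n)/n for increasingly large n: the ratio shrinks toward 0.
values = [math.log(n) / n for n in (10, 10**3, 10**6, 10**9)]

# ln(f(n))/g(n) for sample polynomials f(n) = 3n^2 + 1, g(n) = 2n + 5
# (arbitrary choices with positive leading coefficients).
def poly_ratio(n):
    return math.log(3 * n * n + 1) / (2 * n + 5)

ratios = [poly_ratio(n) for n in (10, 10**3, 10**6)]
print(values)  # each entry smaller than the previous one
print(ratios)  # also decreasing toward 0
```

The numbers only suggest the limit; the proofs below establish it.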


8 Answers

Accepted answer:

The standard proof I know at the high-school level consists in first proving (with derivatives) that $\;\ln x < \sqrt x\;$ for all $x>0$.

Then one deduces that, for all $n\ge 1$, $$0\le\frac{\ln n}n<\frac{\sqrt n}n=\frac1{\sqrt n},$$ and observes the latter expression tends to $0$.
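As a numerical sanity check of this chain of inequalities (a Python sketch, not part of the original answer):

```python
import math

# Verify 0 <= ln(n)/n < 1/sqrt(n) for a range of n, as in the proof above.
for n in range(1, 10_001):
    assert 0 <= math.log(n) / n < 1 / math.sqrt(n)

# The upper bound 1/sqrt(n) itself tends to 0.
print(1 / math.sqrt(10**6))
```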

The last question, as far as I know, requires asymptotic analysis, namely finding an asymptotic equivalent for $f(n)$ and $g(n)$ (their leading monomials), but I don't think this is in the high-school curriculum in any country.

Edit: Actually, one may circumvent the use of asymptotic analysis, at the cost of a slightly longer proof. Let (with $a>0$): \begin{align}f(n)&= an^k + \sum_{i=0}^{k-1} a_in^i,& g(n)&= cn^l + \sum_{i=0}^{l-1} c_in^i. \end{align} Then we may write $\ln\bigl(f(n)\bigr)=\ln(an^k)+\ln\Bigl(1+\sum_{i=0}^{k-1}\frac{a_i}{a n^{k-i}}\Bigr)$, so that $$\frac{\ln\bigl(f(n)\bigr)}{g(n)}=\frac{\ln a}{g(n)}+\frac{k\ln(n)}{g(n)}+\frac{1}{g(n)}\ln\Bigl(1+\sum_{i=0}^{k-1}\frac{a_i}{a n^{k-i}}\Bigr),$$ and it is a simple routine to show that each of these fractions tends to $0$ when $n$ tends to $\infty$.

Answer:

If $f(x)$ is a polynomial of degree $n$ then the derivative of $\ln f(x)$ is $f'(x)/f(x)$ which is a rational expression with the degree of the numerator one less than the degree of the denominator. Then by L'Hospital's Rule we have

$$\lim_{x\to \infty} \frac{\ln f(x)}{g(x)} = \lim_{x\to \infty} \frac{f'(x)}{f(x)g'(x)}.$$

The last is a rational expression with numerator degree smaller than denominator degree. So the limit is zero. (Assuming the degree of $g(x)$ is at least $1$.)
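As a sanity check of the conclusion (a Python sketch; the example $f(x)=x^3+2x+1$ and $g(x)=x^2$ is an arbitrary choice, not from the answer):

```python
import math

# f(x) = x^3 + 2x + 1 (degree 3), g(x) = x^2 (degree >= 1).
# ln(f(x))/g(x) should tend to 0 as x grows.
ratios = [math.log(x**3 + 2*x + 1) / x**2 for x in (10, 100, 1000)]
print(ratios)  # decreasing toward 0
```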

Answer:

Comment on the Question

In the question, it is stated that since $0\le\log(n)\lt n$, we have $$ \lim_{n\to\infty}\frac{\log(n)}n=0 $$ By that same logic, since $0\le\frac12n\lt n$, we have $$ \lim_{n\to\infty}\frac{\frac12n}n=0 $$ which is false.


Part 1: Pre-Calculus Approach

Bound $\boldsymbol{\frac{\log(n)}n}$ $$ \begin{align} \left(1+\frac1{\sqrt{n}}\right)^n &\ge1+\sqrt{n}\tag{1a}\\[6pt] &\ge\sqrt{n}\tag{1b}\\[3pt] \left(1+\frac1{\sqrt{n}}\right)^2 &\ge n^{1/n}\tag{1c}\\ 2\log\left(1+\frac1{\sqrt{n}}\right) &\ge\frac{\log(n)}n\tag{1d} \end{align} $$ Explanation:
$\text{(1a)}$: Bernoulli's Inequality
$\text{(1b)}$: $1\ge0$
$\text{(1c)}$: raise to the $2/n$ power
$\text{(1d)}$: take the logarithm

Since $\frac{\log(n)}{n}\ge0$ and $\lim\limits_{n\to\infty}\log\left(1+\frac1{\sqrt{n}}\right)=0$, $(1)$ and the Squeeze Theorem imply $$ \lim_{n\to\infty}\frac{\log(n)}n=0\tag2 $$
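Inequality $(1\mathrm d)$ and the vanishing of its left-hand side are easy to test numerically (a Python sketch, not part of the answer):

```python
import math

# Check (1d): 2*log(1 + 1/sqrt(n)) >= log(n)/n for n >= 1,
# and observe that the left-hand side itself tends to 0.
for n in range(1, 10_001):
    assert 2 * math.log(1 + 1 / math.sqrt(n)) >= math.log(n) / n

print(2 * math.log(1 + 1 / math.sqrt(10**8)))  # a small upper bound
```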


Part 2: Apply to Polynomials

Let $P(n)=\sum\limits_{k=0}^ma_kn^k$ where $a_m\gt0$; that is, $\deg(P)=m$. Then $$ \frac{\log(P(n))}{\log(n)}=\frac{\log\left(a_mn^m\right)}{\log(n)}+\overbrace{\ \frac1{\log(n)}\ \vphantom{\sum_{k=0}^1}}^{\substack{\text{vanishes}\\\text{as }n\to\infty}}\overbrace{\log\left(1+\sum_{k=0}^{m-1}\frac{a_k}{a_m}\frac1{n^{m-k}}\right)}^{\text{vanishes as }n\to\infty}\tag3 $$ Therefore, $$ \begin{align} \lim_{n\to\infty}\frac{\log(P(n))}{\log(n)} &=\lim_{n\to\infty}\frac{\log\left(a_mn^m\right)}{\log(n)}\tag{4a}\\ &=\lim_{n\to\infty}\frac{\log(a_m)}{\log(n)}+\lim_{n\to\infty}\frac{m\log(n)}{\log(n)}\tag{4b}\\[6pt] &=0+m\tag{4c}\\[12pt] &=\deg(P)\tag{4d} \end{align} $$ Thus, if $\deg(Q)\ge1$, regardless of $\deg(P)$, $$ \begin{align} \lim_{n\to\infty}\frac{\log(P(n))}{Q(n)} &=\lim_{n\to\infty}\frac{\log(P(n))}{\log(n)}\lim_{n\to\infty}\frac{\log(n)}{n}\lim_{n\to\infty}\frac{n}{Q(n)}\tag{5a}\\ &=\deg(P)\cdot0\cdot\lim_{n\to\infty}\frac{n}{Q(n)}\tag{5b}\\[3pt] &=0\tag{5c} \end{align} $$ Explanation:
$\text{(5a)}$: limit of a product is the product of the limits
$\text{(5b)}$: apply $(2)$ and $(4)$
$\text{(5c)}$: if $\deg(Q)=1$, the limit is the reciprocal of the lead coefficient of $Q$
$\phantom{\text{(5c):}}$ if $\deg(Q)\gt1$, the limit is $0$
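A numerical illustration of $(4)$ (a Python sketch; the polynomial $P(n)=2n^3+5n+7$ is an arbitrary choice):

```python
import math

# log(P(n))/log(n) should approach deg(P) = 3 as n grows
# (the residual error is ln(a_m)/ln(n), which vanishes slowly).
def P(n):
    return 2 * n**3 + 5 * n + 7

ratio = math.log(P(10**6)) / math.log(10**6)
print(ratio)  # close to 3
```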

Answer:

The fact that $\ln n<n$ is not sufficient to conclude that $\lim_{n\to\infty}\frac{\ln n}{n}=0$.

For instance, $n/2<n$ for every positive integer $n$, but $$ \lim_{n\to\infty}\frac{n/2}{n}=\frac{1}{2} $$

How to prove that limit depends on how you define the logarithm and the exponential. For instance, if you define $$ \ln x=\int_1^x \frac{1}{t}\,dt $$ you can observe that, for $0<x<1$, $$ -\ln x=\int_x^1 \frac{1}{t}\,dt=\int_x^{\sqrt{x}}\frac{1}{t}\,dt+\int_{\sqrt{x}}^1\frac{1}{t}\,dt\le(\sqrt{x}-x)\frac{1}{x}+(1-\sqrt{x})\frac{1}{\sqrt{x}}=\frac{2}{\sqrt{x}}-2 $$ (on each interval, $1/t$ is bounded by its value at the left endpoint) and therefore $$ 0\le-x\ln x\le 2\sqrt{x}-2x=2\sqrt{x}(1-\sqrt{x}) $$ so squeezing implies that $$ \lim_{x\to0^+}x\ln x=0 $$ Therefore, with $y=1/x$, $$ \lim_{y\to\infty}\frac{1}{y}\ln(1/y)=\lim_{y\to\infty}-\frac{\ln y}{y}=0 $$ and therefore also $$ \lim_{n\to\infty}\frac{\ln n}{n}=0 $$ Other definitions require different proofs.

Generalizing, we can say that $$ \lim_{n\to\infty}\frac{\ln(n^k)}{n}=0 $$ If now $f$ is a polynomial of degree $k>0$, we can write $$ f(n)=a_kn^k+a_{k-1}n^{k-1}+\dots+a_1n+a_0 =a_kn^k\Bigl(1+\frac{a_{k-1}}{a_kn}+\dots+\frac{a_1}{a_kn^{k-1}}+\frac{a_0}{a_kn^k}\Bigr) $$ and, in order that $$ \lim_{n\to\infty}\frac{\ln(f(n))}{n} $$ makes sense, we need $a_k>0$ (so that $f(n)>0$ for $n$ large). Moreover, the fact that $$ \lim_{n\to\infty}\Bigl(1+\frac{a_{k-1}}{a_kn}+\dots+\frac{a_1}{a_kn^{k-1}}+\frac{a_0}{a_kn^k}\Bigr)=1 $$ tells you that, for $n$ greater than a suitable $\bar{n}$, we have $$ \frac{1}{2}<1+\frac{a_{k-1}}{a_kn}+\dots+\frac{a_1}{a_kn^{k-1}}+\frac{a_0}{a_kn^k}<2 $$ Therefore, for $n>\bar{n}$, $$ \frac{1}{2}a_kn^k<f(n)<2a_kn^k $$ and now you can squeeze and conclude that $$ \lim_{n\to\infty}\frac{\ln(f(n))}{n}=0 $$ If now $g$ is another polynomial of positive degree, you can write $$ \frac{\ln(f(n))}{g(n)}=\frac{\ln(f(n))}{n}\cdot\frac{n}{g(n)} $$ and the limit is zero, since $\frac{n}{g(n)}$ has a finite limit.
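A quick numerical check of the key step $x\ln x\to0$ as $x\to0^+$, together with the (true) bound $|x\ln x|\le 2\sqrt{x}$ (a Python sketch, not part of the answer):

```python
import math

# |x ln x| shrinks to 0 as x -> 0+, staying below 2*sqrt(x).
xs = [10.0**-k for k in range(1, 9)]
vals = [abs(x * math.log(x)) for x in xs]
assert all(v <= 2 * math.sqrt(x) for x, v in zip(xs, vals))
print(vals[-1])  # tiny
```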

Answer:

At the request of the OP, I am writing an answer based on my comments to the question. The presentation is designed to be at an elementary level and avoids derivatives / integrals.


The natural logarithm is characterized by two key properties:

  • $\log (xy) =\log x+\log y, \, \forall x, y>0$
  • $\log x\leq x - 1,\,\forall x>0$

in the sense that any function $f:\mathbb {R} ^+\to\mathbb {R} $ which satisfies $f(xy) =f(x) +f(y)$ and $f(x) \leq x - 1$ is the natural logarithm.

We use the above properties to prove the desired limit in question. Let $n$ be a positive integer and then we have $$0\leq \log \sqrt{n} \leq \sqrt {n} - 1<\sqrt{n}$$ or $$0\leq \log n<2\sqrt{n}$$ or $$0\leq\frac{\log n} {n} <\frac{2}{\sqrt{n}}$$ Applying squeeze theorem we can see that $$\lim_{n\to\infty} \frac{\log n} {n} =0$$
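The bound derived here can be checked numerically (a Python sketch, not part of the answer):

```python
import math

# Verify 0 <= log(n)/n < 2/sqrt(n), the inequality derived
# from log x <= x - 1 applied at sqrt(n).
for n in range(1, 10_001):
    assert 0 <= math.log(n) / n < 2 / math.sqrt(n)
```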


Next let us use a definition of logarithm which establishes the fundamental properties described at the start of the answer. The simplest approach is using $\log x =\int_1^x\frac{dt}{t}$ but it needs a reasonable development of integral calculus (in particular one needs to show that the integral exists).

Instead we assume the convergence of bounded and monotone sequences and define $$\log x=\lim_{n\to\infty} n(x^{1/n}-1),\,\forall x>0\tag{1}$$ By definition we have $\log 1=0$. We show that the limit exists not just for $x=1$ (trivial case), but also for all other positive values of $x$.

We make use of the following key inequalities $$\frac{a^r-1}{r}>\frac{a^s-1}{s},\frac{1-b^r}{r}<\frac{1-b^s}{s}\tag{2}$$ where $a, b$ are real numbers with $0<b<1<a$ and $r, s$ are rational numbers with $r>s>0$. The inequalities are established using algebraic manipulation in this answer.

Let $f(x, n) =n(x^{1/n}-1)$. Putting $r=1/n,s=1/(n+1)$ and $a=x$ if $x>1$ and $b=x$ if $0<x<1$ in $(2)$, we see that $f(x, n) $ is a decreasing sequence in $n$. For $x>1$ we can note that $f(x, n) >0$ and hence $f(x, n) $ tends to a limit as $n\to\infty $.

For $0<x<1$ we need a bit more work. Let $y=1/x>1$ and then $$f(x, n) =n((1/y)^{1/n} - 1)=-\frac{n(y^{1/n}-1)}{y^{1/n}}=-\frac{f(y, n)} {y^{1/n}} \tag{3}$$ Now $f(y, n) $ tends to a limit and $y^{1/n}\to 1$ so that $f(x, n) $ tends to a limit.

It is now established that the limit in $(1)$ exists for all positive real numbers $x$ and hence the logarithm function is well defined with domain $\mathbb {R} ^+$.

Next we have for positive real numbers $x, y$ \begin{align} \log (xy) &=\lim_{n\to \infty} n((xy) ^{1/n}-1)\notag\\ &=\lim_{n\to \infty} n(x^{1/n}y^{1/n}-y^{1/n}+y^{1/n}-1)\notag\\ &=\lim_{n\to \infty} y^{1/n}\cdot n(x^{1/n}-1)+n(y^{1/n}-1)\notag\\ &=1\cdot\log x+\log y\notag\\ &=\log x +\log y\notag \end{align} Putting $y=1/x$ in above relation we get $$\log x+\log (1/x)=\log 1=0$$ and hence $$\log(1/x)=-\log x\tag{4}$$ and further $$\log(x/y) =\log x+\log(1/y)=\log x - \log y\tag{5}$$ Next we assume $x>1$ and put $r=1,s=1/n,a=x$ in $(2)$ to get $$n(x^{1/n} - 1)\leq x-1$$ and taking limits we see that $$\log x \leq x - 1$$ for all $x>1$. The inequality holds trivially for $x=1$. For $0<x<1$ we apply $(2)$ with $r=1,s=1/n,b=x$ and get the same inequality as before. Thus we have established the fundamental inequality satisfied by $\log x$ namely $$\log x\leq x-1$$ for all positive real numbers $x$.
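Definition $(1)$ behaves as expected numerically; here is a small Python sketch comparing $n(x^{1/n}-1)$ with the built-in logarithm (the latter used only as an external reference, with $x=5$ an arbitrary choice):

```python
import math

def f(x, n):
    """The sequence n*(x**(1/n) - 1) from definition (1)."""
    return n * (x**(1.0 / n) - 1)

# Decreasing in n, and converging to log(x):
seq = [f(5.0, n) for n in (1, 10, 100, 10**6)]
print(seq)                            # decreasing toward log 5
print(abs(seq[-1] - math.log(5.0)))   # small discrepancy
```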

Answer:

For me, the easiest proof uses $\ln(x) =\int_1^x \dfrac{dt}{t} $ for $x \ge 1$.

Then, for $x > 0$ and $0 < c < 1$,

$\begin{array}\\ \ln(1+x) &=\int_1^{1+x} \dfrac{dt}{t}\\ &=\int_0^{x} \dfrac{dt}{1+t}\\ &\lt\int_0^{x} \dfrac{dt}{(1+t)^c} \qquad\text{since }(1+t)^c < 1+t \text{ for } t>0\\ &=\dfrac{(1+t)^{1-c}}{1-c}\Big|_0^{x}\\ &=\dfrac{(1+x)^{1-c}-1}{1-c}\\ &<\dfrac{(1+x)^{1-c}}{1-c}\\ \text{so}\\ \dfrac{\ln(1+x)}{1+x} &<\dfrac{(1+x)^{-c}}{1-c}\\ &\to 0 \qquad\text{ as } x \to \infty\\ \end{array} $

You can use this to show that

$\dfrac{\ln(x)}{x^d} \to 0$ for $0 < d < 1$ as $x \to \infty$.
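A numeric check of the final inequality with, say, $c=\tfrac12$ (a Python sketch; the choice of $c$ and of the sample points is arbitrary):

```python
import math

c = 0.5
# ln(1+x)/(1+x) < (1+x)^(-c)/(1-c), and the right-hand side tends to 0.
for x in (1.0, 10.0, 10.0**3, 10.0**6):
    lhs = math.log(1 + x) / (1 + x)
    rhs = (1 + x)**(-c) / (1 - c)
    assert lhs < rhs

print((1 + 10.0**6)**(-c) / (1 - c))  # small
```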

Answer:

As an alternative to the fine answers already given we can also proceed as follows.

We can prove, more generally, that for real $x$ we have $\lim_{x\to \infty} \frac{\ln x}{x}=0$.

Indeed, substituting $x=e^y$ with $y \to \infty$,

$$\lim_{x\to \infty} \frac{\ln x}{x}=0 \iff \lim_{y\to \infty} \frac{e^y}{y}=\infty$$

and the latter can be proved:

  • at a first stage, by induction for $n\in\mathbb N$, showing that eventually $e^n\ge n^2$, so that $\frac{e^n}{n}\ge n\to\infty$ by comparison;
  • then extending the result to the reals: for every $y\ge 1$ there exists $n\in\mathbb N$ with $y \in[n,n+1)$, and therefore, by comparison,

$$\frac{e^y}{y}\ge \frac{e^n}{n+1} \to \infty$$
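Both steps are easy to check numerically (a Python sketch; the induction claim is checked over a finite range only):

```python
import math

# Step 1: e^n >= n^2 holds over this range (induction proves it eventually).
assert all(math.exp(n) >= n * n for n in range(1, 60))

# Step 2: e^y / y grows without bound.
growth = [math.exp(y) / y for y in (1.0, 10.0, 20.0, 40.0)]
print(growth)  # strictly increasing, very fast
```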

Answer:

First prove that $\frac{\ln n}{n}$ is eventually monotone decreasing.

By calculus, $\ln(1+x)<x$ for $x>0$, so

$$ \ln(n+1)=\ln n+\ln\Bigl(1+\frac1n\Bigr)<\ln n+\frac1n. $$

$$ n \ln(n+1) < n \ln n +1 < (n+1) \ln n \qquad (\text{for } n\ge 3, \text{ since then } \ln n > 1)$$

$$ \frac{\ln(n+1)}{n+1} < \frac{\ln(n)}{n}$$

By the definition of limit, it now suffices to show that $\frac{\ln n}{n}$ eventually lies in an arbitrarily small neighbourhood of $0$.

Given $K > 0$, taking $n>e^K$, we get

$$ \frac{\ln (n)}{n} < \frac{\ln (e^K)}{e^K} = \frac{K}{e^K} < K$$
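Both the monotonicity (from $n=3$ on) and the final bound can be verified numerically (a Python sketch, not part of the answer):

```python
import math

# ln(n)/n is strictly decreasing from n = 3 onward...
assert all(math.log(n + 1) / (n + 1) < math.log(n) / n
           for n in range(3, 10_000))

# ...and for any K > 0, taking n > e^K pushes ln(n)/n below K.
for K in (0.5, 1.0, 5.0):
    n = math.ceil(math.exp(K)) + 1
    assert math.log(n) / n < K
```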