Unexpected examples of natural logarithm


Quite often, mathematics students are surprised by the fact that for a mathematician, the term “logarithm” and the expression $\log$ nearly always mean natural logarithm rather than the common logarithm. Because of that, I have been gathering examples of problems whose statements have nothing to do with logarithms (or the exponential function), but whose solutions do involve natural logarithms. The goal is, of course, to make the students see how natural the natural logarithms really are. Here are some of these problems:

  1. The sum of the series $1-\frac12+\frac13-\frac14+\cdots$ is $\log2$.
  2. If $x\in(0,+\infty)$, then $\lim_{n\to\infty}n\bigl(\sqrt[n]x-1\bigr)=\log x$.
  3. What's the average distance from a point of a square with side length $1$ to the center of the square? The question is ambiguous: is the square a curve (the four sides) or a two-dimensional region? In the first case, the answer is $\frac14\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$; in the second case, the answer is smaller (of course): $\frac16\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$.
  4. The length of an arc of a parabola can be expressed using logarithms.
  5. The area below an arc of the hyperbola $y=\frac1x$ (and above the $x$-axis) can be expressed using natural logarithms.
  6. Suppose that there is an urn with $n$ different coupons, from which coupons are drawn, each equally likely, with replacement. How many draws do you expect to need before having drawn each coupon at least once? The answer is about $n\log(n)+\gamma n+\frac12$, where $\gamma$ is the Euler–Mascheroni constant.
  7. For each $n\in\mathbb N$, let $P_p(n)$ be the number of primitive Pythagorean triples whose perimeter is smaller than $n$. Then $\displaystyle P_p(n)\sim\frac{n\log2}{\pi^2}$. (By the way, this is also an unexpected use of $\pi$.)
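Several of the items above can be checked numerically. Here is a quick plain-Python sanity check of items 1, 2 and 6 (the tolerances are rough choices of mine, based on the expected error terms):

```python
import math

# Item 1: partial sums of 1 - 1/2 + 1/3 - ... approach log 2.
# The alternating series error is below the first omitted term (~5e-6 here).
s = sum((-1) ** (k + 1) / k for k in range(1, 200001))
assert abs(s - math.log(2)) < 1e-5

# Item 2: n * (x^(1/n) - 1) -> log x, shown here for x = 5.
x, n = 5.0, 10**7
assert abs(n * (x ** (1 / n) - 1) - math.log(x)) < 1e-5

# Item 6: the exact coupon-collector expectation n * H_n versus the
# approximation n*log(n) + gamma*n + 1/2 (they differ by about 1/(12n)).
gamma = 0.5772156649015329  # Euler–Mascheroni constant
n = 1000
exact = n * sum(1 / k for k in range(1, n + 1))
approx = n * math.log(n) + gamma * n + 0.5
assert abs(exact - approx) < 0.01
```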

Could you please suggest some more?


---

The continuous solution on $(0,\infty)$ of the functional equation $f(x\cdot y)=f(x)+f(y)$, with the condition $f'(1)=1$, is $f(x)=\ln (x)$.

Changing the value of $f'(1)$, we obtain the logarithms to other bases.

---

What about the Prime Number Theorem? The number of primes smaller than $x$ is denoted by $\pi (x)$, and we have $$\pi (x) \sim \frac{x}{\log x}$$
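Convergence here is famously slow, but even a small sieve makes the statement visible (a plain-Python Sieve of Eratosthenes; the cutoff $10^6$ is an arbitrary choice):

```python
import math

# Sieve of Eratosthenes, then compare pi(x) with x / log(x).
def primes_up_to(x):
    sieve = bytearray([1]) * (x + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(x**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return sum(sieve)

x = 10**6
pi_x = primes_up_to(x)            # pi(10^6) = 78498
ratio = pi_x / (x / math.log(x))
assert 1.0 < ratio < 1.2          # the ratio tends to 1, slowly
```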

---

Here are some of my favorites:

  • By "reversing" Euler's formula (on the principal branch, i.e. for $-\pi<x\le\pi$), $$\ln(\cos x+i\sin x)=ix$$

  • The natural log appears in some of the integrals of trigonometric functions: $$\int \tan (x)\, dx=\ln|\sec(x)|+C$$ $$\int \cot (x)\, dx=\ln|\sin(x)|+C$$ $$\int \sec (x)\, dx=\ln|\sec(x)+\tan(x)|+C$$

  • The appearance of the natural logarithm in the Tsiolkovsky rocket equation: $$\Delta v=v_e\ln\frac{m_0}{m_f}$$
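The three trigonometric antiderivatives can be verified numerically by differentiating the right-hand sides (a small finite-difference check; the step size and sample points are arbitrary choices):

```python
import math

# Central finite difference: (f(x+h) - f(x-h)) / (2h) approximates f'(x).
def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

sec = lambda x: 1 / math.cos(x)
for x in (0.3, 0.7, 1.1):  # points where sec, tan, cot are all defined
    assert abs(deriv(lambda t: math.log(sec(t)), x) - math.tan(x)) < 1e-6
    assert abs(deriv(lambda t: math.log(math.sin(t)), x) - 1 / math.tan(x)) < 1e-6
    assert abs(deriv(lambda t: math.log(sec(t) + math.tan(t)), x) - sec(x)) < 1e-6
```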

---

Consider phase transitions in the Erdős-Rényi model $G(n, p)$. We have

The property that $G(n, p)$ has diameter two has a sharp threshold at $p = \sqrt{\frac{2\ln n}{n}}$.

That is, if $p$ is smaller than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is greater than $2$ goes to $1$ in the limit, as $n$ goes to $\infty$; if $p$ is greater than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is smaller than or equal to $2$ goes to $1$ as $n$ goes to $\infty$.

Another similar conclusion is

The disappearance of isolated vertices in $G(n, p)$ has a sharp threshold at $p = \frac{\ln n}{n}$.
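The second threshold is easy to make plausible without simulation: the expected number of isolated vertices in $G(n,p)$ is exactly $n(1-p)^{n-1}\approx n e^{-pn}$, which equals $n^{1-c}$ at $p=c\ln n/n$. A quick check (the values of $n$ and $c$ below are arbitrary choices):

```python
import math

# Expected number of isolated vertices in G(n, p) is n * (1 - p)^(n - 1).
# Around p = ln(n)/n it jumps from "many" to "almost none".
n = 10**6
for c, expect_many in ((0.5, True), (2.0, False)):
    p = c * math.log(n) / n
    expected_isolated = n * (1 - p) ** (n - 1)
    # Heuristically n * exp(-p*n) = n^(1-c): about 1000 for c = 0.5,
    # about 10^-6 for c = 2.
    if expect_many:
        assert expected_isolated > 100
    else:
        assert expected_isolated < 0.01
```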

---

Solve $x^n-x-1=0$ for various values of $n$ ($n\ge 2$). There will be one root greater than $1$ for each $n$. The asymptotic behavior of this root as $n$ increases without bound is given to two terms as:

$$x=1+\frac{\log 2}{n}+o\!\left(\frac{1}{n}\right)$$
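This is easy to check by solving the equation numerically. A sketch with bisection, comparing in log form to avoid overflow of $x^n$; the tolerance $2/n$ is my rough allowance for the $O(1/n)$ next-order term:

```python
import math

# Find the root of x^n - x - 1 = 0 in (1, 2) by bisection.
def root(n):
    lo, hi = 1.0, 2.0
    for _ in range(100):
        mid = (lo + hi) / 2
        # x^n < x + 1  <=>  n*log(x) < log(x + 1), which avoids overflow.
        if n * math.log(mid) < math.log(mid + 1):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# n * (x - 1) should approach log 2.
for n in (100, 1000, 10000):
    assert abs(n * (root(n) - 1) - math.log(2)) < 2 / n
```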

---

Here's another one related to some of your examples: the $n$-th harmonic number

$$ H_n = 1 + \frac{1}{2} + \ldots + \frac{1}{n} $$

satisfies

$$ H_n \approx \ln(n) + \gamma $$

where $\gamma$ is the Euler-Mascheroni constant. The error in the above approximation is slightly less than $\frac{1}{2n}$.
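Both the approximation and the error bound are easy to confirm numerically:

```python
import math

gamma = 0.5772156649015329  # Euler–Mascheroni constant

# H_n - (ln n + gamma) is positive and slightly less than 1/(2n);
# in fact it equals 1/(2n) - 1/(12n^2) + ...
for n in (10, 100, 1000):
    H = sum(1 / k for k in range(1, n + 1))
    err = H - (math.log(n) + gamma)
    assert 0 < err < 1 / (2 * n)
```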

---

Using $\sigma(n)$ as the sum of the (positive) divisors of a natural number $n,$ we have $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{0.64821364942... \; n}{\log \log n},$$ with the constant in the numerator giving equality for $n=12.$ Here $\gamma = \lim_{n\to\infty} \left(H_n - \log n\right).$

As suggested by Oscar, we may write this without approximations as $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{ n \; ( \log \log 12) \left(\frac{7}{3} -e^\gamma \,\log \log 12 \right)}{\log \log n}.$$

There are some numbers up to $n \leq 5040 \;$ (such as $n=12$) for which $ \sigma(n) > e^\gamma \, n \, \log \log n .$ The conjecture that, for $n > 5040,$ we have $ \sigma(n) < e^\gamma \, n \, \log \log n ,$ is equivalent to the Riemann Hypothesis.

Note that the occurrence of $\log \log n$ means that we cannot replace the natural logarithm by some other without changing the sense of the statement. We would not just be multiplying by a constant if we used a different logarithm.
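Robin's criterion is easy to explore by brute force for small $n$ (a sketch with trial-division divisor sums, slow but fine up to 5040; I start at $n=3$ so that $\log\log n$ is positive):

```python
import math

# Sum of divisors by trial division up to sqrt(n).
def sigma(n):
    total, d = 0, 1
    while d * d <= n:
        if n % d == 0:
            total += d
            if d != n // d:
                total += n // d
        d += 1
    return total

gamma = 0.5772156649015329
bound = lambda n: math.exp(gamma) * n * math.log(math.log(n))

# The numbers n <= 5040 violating sigma(n) < e^gamma * n * log log n;
# 12 and 5040 are among them.
exceptions = [n for n in range(3, 5041) if sigma(n) > bound(n)]
assert 12 in exceptions and 5040 in exceptions
```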

---

In Calculus I the student learns how to find antiderivatives of $x^n$ for all integers $n \ne -1$. They scratch their heads and scream

"Give me the antiderivative of the inversion function $1/x$!"

OK you say, here it is:

$$\ln(t)=\int _{1}^{t}\frac {1}{x}\,dx$$

---

$$ \frac{d}{dx}\,(x^x) = x^x \ (\ln(x)+1) $$

---

How do you count connected labeled graphs on $n$ vertices?

Let's take the not-necessarily-connected case first. There are $\binom{n}{2}$ possible edges between the $n$ vertices, and for each you may include it or not. So there are $$2^\binom{n}{2}$$ possible graphs.

Now to count connected graphs, we need to do some "generatingfunctionology", to steal Wilf's term. Let $$f(x) = \sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}$$ be the (formal) exponential generating function for labeled graphs. Then if $c_n$ is the number of connected graphs on $n$ vertices, we have

$$\sum_{n=1}^\infty c_n \frac{x^n}{n!} = \log f(x) = \log\sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}.$$

This is astonishing the first time you see it, but it is very natural once you understand how exponentiation works on exponential generating functions.
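The log relation can be unwound into a recurrence by conditioning on the component containing vertex $1$: $c_n = 2^{\binom n2}-\sum_{k=1}^{n-1}\binom{n-1}{k-1}c_k\,2^{\binom{n-k}2}$. This reproduces the known counts $1, 1, 4, 38, 728, \dots$:

```python
from math import comb

# c_n = number of connected labeled graphs on n vertices, via the
# recurrence: total graphs = sum over the size k of the component
# containing vertex 1 of C(n-1, k-1) * c_k * 2^C(n-k, 2).
c = {1: 1}
for n in range(2, 8):
    c[n] = 2 ** comb(n, 2) - sum(
        comb(n - 1, k - 1) * c[k] * 2 ** comb(n - k, 2) for k in range(1, n)
    )

assert [c[n] for n in range(1, 6)] == [1, 1, 4, 38, 728]
```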

---

This is more about $e$ than the natural logarithm, but I was surprised that the maximum of $x^{1/x}$ occurs at $x=e$.

That comes up in studying the equation $a^b = b^a$ for $a, b \in \mathbb{R}$ with $a\ne b$.

---

I found it quite remarkable that $$\int\frac{dx}{x\log(x)\log(\log(x))} = \log(\log(\log(x)))+C$$ But more generally, if $\log^{\circ i}(x)$ means $\log\underbrace\cdots_{i\text{ times}}\log x$, then $$\int\frac{dx}{x\prod_{i=1}^n{\log^{\circ i}(x)}} = \log^{\circ (n+1)}(x)+C,\quad n\in\mathbb{N}$$ Indeed, $${\mathrm d\over\mathrm dx}\log\log\log\log(x)=\frac{1}{x\log(x)\log\log(x)\log\log\log(x)}$$

---

Your first point can be generalized. Write $[a_1,a_2,a_3,\dots]$ for $\sum a_n/n$. You wrote:$$[\overline{1,-1}]=\ln2.$$(The bar means repeat.) Then we also have:\begin{align}[\overline{1,1,-2}]&=\ln3,\\ [\overline{1,1,1,-3}]&=\ln4,\end{align}and in general:$$[\overline{\underbrace{1,1,\dots,1}_{n-1},1-n}]=\ln n.$$


As a side note, one can see that $\ln m+\ln n=\ln mn$ from this. For example, note that, from the definition, we have $[\overline{0,2,0,-2}]=[\overline{1,-1}]=\ln2$ (from doubling the numerators and denominators). We then have:\begin{align}\ln2+\ln2={}&[\overline{1,-1,1,-1}]+\\&[\overline{0,2,0,-2}]\\{}=&[\overline{1,1,1,-3}]=\ln4\end{align} Similarly: \begin{align}\ln2+\ln3={}&[\overline{0,0,3,0,0,-3}]+\\&[\overline{1,1,-2,1,1,-2}]\\{}=&[\overline{1,1,1,1,1,-5}]=\ln6\end{align}
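These repeating patterns are easy to test numerically (the number of terms below is an arbitrary common multiple of all the periods, so each partial sum ends on a full period):

```python
import math

# Sum the series sum_k pattern[(k-1) mod len(pattern)] / k up to `terms`.
def pattern_sum(pattern, terms):
    return sum(pattern[(k - 1) % len(pattern)] / k for k in range(1, terms + 1))

N = 300000  # divisible by 2, 3 and 4, so all patterns close cleanly
assert abs(pattern_sum([1, -1], N) - math.log(2)) < 1e-4
assert abs(pattern_sum([1, 1, -2], N) - math.log(3)) < 1e-4
assert abs(pattern_sum([1, 1, 1, -3], N) - math.log(4)) < 1e-4
```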

---

Here is one containing a lot of $\log$s.

Consider the standard multiplication table, but with rows and columns indexed by $1$ to $N$ instead of $1$ to $10$. The question is: how many distinct integers appear among its entries? Perhaps surprisingly, the answer is asymptotically smaller than $N^2$. Ford has shown that the answer is of order $$\frac{N^2}{(\log N)^{c_1}(\log\log N)^{3/2}},$$ where $c_1=1-\frac{1+\log\log 2}{\log 2}$. Similarly, if we consider the $(k+1)$-dimensional multiplication table (defined in the obvious manner), the number of distinct integers in it is of order $$\frac{N^{k+1}}{(\log N)^{c_k}(\log\log N)^{3/2}},\quad c_k=\frac{\log(k+1)+k\log k-k\log\log(k+1)-k}{\log(k+1)}.$$
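For small $N$ the distinct count is easy to get by brute force; the classic $10\times10$ table has $42$ distinct entries, and the density already drops noticeably as $N$ grows:

```python
# Count distinct entries of the N x N multiplication table by brute force
# (using i <= j, since the table is symmetric).
def distinct_products(N):
    return len({i * j for i in range(1, N + 1) for j in range(i, N + 1)})

assert distinct_products(10) == 42     # the classic 10 x 10 table
assert distinct_products(300) < 0.5 * 300**2  # far fewer than N^2 entries
```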

---

The law of the iterated logarithm states that $$ \limsup_{n \to \infty} \frac{X_1+\cdots+X_n}{\sqrt{n \log\log n}} = \sqrt 2 $$ almost surely, where $X_1,\ldots,X_n$ are iid random variables with means zero and unit variances.

---

Boltzmann's entropy equation:

$$S = k\ln{W},$$ where $k$ is Boltzmann's constant and $W$ is the number of microstates.

---

Something I found some months back. Would be surprised if this hasn't been looked at before. No citations.

We say a set $S$ can express $n$ if it is possible to write $n$ as a sum of elements of $S$, with repetition allowed.

We say that a set $S$ is critical for $n$ if $S$ can express $n$ and no strict subset of $S$ can express $n$.

Let $u_n$ be the size of the largest subset of $\{1,2,\dotsc ,n\}$ that is critical for $n$. It's conjectured that $u_n$ grows like $\log_e n$.

Evidence: a supporting plot (image not preserved here).

---

Perhaps the students would enjoy that the area of the unit circle may be expressed as $$ - \sqrt{-1} \log{(-1)} $$

---

$$\sum_{k=1}^{\infty} \frac{k \bmod j}{k(k+1)} = \log{j}, \quad \forall j \in \mathbb{N}$$
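Since $0 \le k \bmod j < j$, the terms are $O(j/k^2)$, so the series converges absolutely and partial sums settle quickly (the cutoff and tolerance below are arbitrary choices):

```python
import math

# Partial sum of sum_k (k mod j) / (k * (k+1)); the tail after N terms
# is roughly ((j-1)/2) / N, tiny for N = 10^5.
def s(j, N=10**5):
    return sum((k % j) / (k * (k + 1)) for k in range(1, N + 1))

for j in (1, 2, 3, 5):
    assert abs(s(j) - math.log(j)) < 1e-3
```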

---

Related to what you have given yourself, but I always found it interesting: $$\int_1^x\frac{dt}t=\ln(x)-\ln(1)=\ln(x)$$ This is perhaps one of the main definitions of $\ln$, though. We also have: $$\Re\left[\ln(x+iy)\right]=\frac12\ln(x^2+y^2)$$


Logs are also often useful in approximating and fitting equations, e.g.: $$y=c x^n$$ $$\ln(y)=\ln(c)+n\ln(x)$$ so plotting $\ln(y)$ against $\ln(x)$ gives a straight line, from which we can read off values for $c$ and $n$.
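A tiny illustration of the log-log fitting trick with made-up data (the values $c=2.5$, $n=1.7$ below are arbitrary): a least-squares line through $(\ln x, \ln y)$ recovers both parameters.

```python
import math

# Hypothetical noiseless data generated from y = c * x^n.
c_true, n_true = 2.5, 1.7
xs = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [c_true * x**n_true for x in xs]

# Least-squares line through (ln x, ln y): slope = n, intercept = ln c.
X = [math.log(x) for x in xs]
Y = [math.log(y) for y in ys]
m = len(X)
xbar, ybar = sum(X) / m, sum(Y) / m
slope = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / sum(
    (x - xbar) ** 2 for x in X
)
intercept = ybar - slope * xbar

assert abs(slope - n_true) < 1e-9                 # slope recovers n
assert abs(math.exp(intercept) - c_true) < 1e-9   # exp(intercept) recovers c
```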

---

One other that I found interesting:

If we pick two points uniformly at random in the unit square $[0,1]^2$, the average (expected) distance between them is: $$\int\limits_0^1\int\limits_0^1\int\limits_0^1\int\limits_0^1\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}\,\text{d}x_1\text{d}x_2\text{d}y_1\text{d}y_2$$ which turns out to equal: $$\frac{2+\sqrt{2}+5\ln(\sqrt{2}+1)}{15}$$
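A Monte Carlo sanity check of this closed form (sample size, seed, and tolerance are arbitrary choices; the standard error here is well under the tolerance):

```python
import math
import random

# Estimate the mean distance between two uniform points in the unit square.
random.seed(0)
N = 200000
total = 0.0
for _ in range(N):
    x1, y1, x2, y2 = (random.random() for _ in range(4))
    total += math.hypot(x1 - x2, y1 - y2)

exact = (2 + math.sqrt(2) + 5 * math.log(1 + math.sqrt(2))) / 15
assert abs(total / N - exact) < 0.01  # exact value is about 0.5214
```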


---

The natural logarithm occurs often when analysing sorting and searching algorithms used in computer science. A famous example is the asymptotic formula of the average number of comparisons $Q_n$ of the Quick Sort algorithm. \begin{align*} \color{blue}{Q_n=2n(\ln n + \gamma -2)+2\ln n+2\gamma+1+O\left(\frac{1}{n}\right)} \end{align*}
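The asymptotic formula can be checked against the exact recurrence $Q_1=0$, $Q_n=(n-1)+\frac{2}{n}\sum_{k=0}^{n-1}Q_k$ (one common convention: $n-1$ comparisons per partitioning step, pivot uniform among the $n$ elements):

```python
import math

# Exact average comparison counts from the Quicksort recurrence,
# then comparison with the asymptotic expansion quoted above.
gamma = 0.5772156649015329
N = 5000
Q = [0.0] * (N + 1)
running = 0.0  # running sum of Q_0 .. Q_{n-1}
for n in range(1, N + 1):
    Q[n] = (n - 1) + 2 * running / n
    running += Q[n]

asym = 2 * N * (math.log(N) + gamma - 2) + 2 * math.log(N) + 2 * gamma + 1
assert abs(Q[N] - asym) < 0.01  # remaining error is O(1/N)
```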

Volume 3 of Knuth's classic The Art of Computer Programming is titled Sorting and Searching. It presents a wealth of applications of these two fundamental combinatorial themes and one gem is C.A.R. Hoare's Quicksort algorithm.

Quicksort is the standard sorting procedure in UNIX systems and has been cited, as we can read in a paper by J.A. Fill, as one of the ten algorithms with the greatest influence on the development and practice of science and engineering in the 20th century.