Presumably, the transcendental number $e$ was first found by taking the power series solution of the (arguably most fundamental) differential equation $f'(x)=f(x)$ with the initial condition $f(0)=1$, and then plugging in $x=1$. My question: is there any way to demonstrate that $f(nx)=f(x)^n$ for all $n$ and $x$ (i.e., that $f$ is an exponential function) other than the laborious and unaesthetic method of applying the equivalent of the multinomial theorem to the power series? Intuitively, I would expect the solution to be an exponential function, but is there a better, reasonably rigorous demonstration of this simple fact?
Derivation of the Exponential Nature of $e^x$
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 5 solutions below.
---
One option is to use the definition $$ e^x = \lim_{n\to\infty} \left(1 + \frac{x}{n}\right)^n. $$ To see that this satisfies the differential equation, notice that $$ \frac{d}{dx} \left(1 + \frac{x}{n}\right)^n = n\left(1 + \frac{x}{n}\right)^{n-1}\cdot\frac{1}{n} = \left(1 + \frac{x}{n}\right)^{n-1} \approx \left(1 + \frac{x}{n}\right)^n. $$ (This can probably be made rigorous with some effort.)
Given this formula, for large $n$ we have $$ \left(1 + \frac{x}{n}\right)^n \left(1 + \frac{y}{n}\right)^n = \left(1 + \frac{x+y}{n} + \frac{xy}{n^2}\right)^n \approx \left(1 + \frac{x+y}{n}\right)^n. $$ (To make this rigorous, use the fact that $x+y+\frac{xy}{n} \approx x+y$.)
This gives $e^x e^y = e^{x+y}$.
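As a quick numerical sanity check of this answer (not part of the original; the helper name `exp_approx` is ours), one can compare the product of two terms of the limit definition with the corresponding term for $x+y$ at a large fixed $n$:

```python
import math

def exp_approx(x, n):
    """(1 + x/n)^n, the n-th term of the limit definition of e^x."""
    return (1 + x / n) ** n

x, y, n = 0.5, 1.25, 10**6

# The product of the two approximations...
lhs = exp_approx(x, n) * exp_approx(y, n)
# ...is close to the single approximation for x + y, since the extra
# xy/n^2 term in the expanded product is negligible for large n.
rhs = exp_approx(x + y, n)

print(lhs, rhs, math.exp(x + y))
```

For $n = 10^6$ the three printed values agree to several decimal places, illustrating $e^x e^y = e^{x+y}$.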
---
Suppose $h$ and $g$ are differentiable functions, with $g$ nowhere zero, which satisfy $h'=nh$ and $g'=ng$.
Then $$\left(\frac hg \right)'=\frac {h'g-hg'}{g^2}=\frac{nhg-nhg}{g^2}=0,$$ whence $h=Ag$ for some constant $A$, and if $h(0)=g(0)$ then $A=1$, so $h=g$.
So we have $\left(f(nx)\right)'=nf'(nx)=nf(nx)$ and $\left(f^n(x)\right)'=nf'(x)f^{n-1}(x)=nf^n(x)$
So with $h=f(nx); g=f^n(x)$ we have $h(0)=g(0)=1$ and therefore $h=g$.
Note this does not depend on $n$ being an integer.
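A small numerical illustration of this answer's conclusion (not part of the original; we take $f = $ `math.exp` as the solution with $f(0)=1$), including non-integer values of $n$:

```python
import math

# The uniqueness argument predicts f(n*x) == f(x)**n, with no
# requirement that n be an integer.
for n in (2, 3, 0.5, math.pi):
    for x in (-1.0, 0.3, 2.0):
        h = math.exp(n * x)   # h(x) = f(nx)
        g = math.exp(x) ** n  # g(x) = f(x)^n  (f > 0, so this is defined)
        assert math.isclose(h, g, rel_tol=1e-12)

print("f(nx) == f(x)^n verified at sample n and x")
```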
---
First, the zero function is a solution to the first-order ODE $$f' = f.$$ So, if a solution has a root $x_0$, then it agrees with the zero solution at $x_0$, and hence by the uniqueness in the Picard–Lindelöf Theorem (this uses that the identity map is Lipschitz) that solution is identically zero. Thus the solution to the given i.v.p., $$f' = f, \quad f(0) = 1,$$ which we suggestively denote $\exp$, never vanishes; since $\exp(0) = 1 > 0$, continuity gives $\exp > 0$ everywhere, so $\exp' = \exp > 0$ and $\exp$ is strictly increasing, hence injective and so bijective onto its image.
By the chain rule, the derivative of the inverse of an invertible differentiable function $g$ (with nonvanishing derivative) is given by $$\frac{d}{da}(g^{-1})(a) = \frac{1}{g'(g^{-1}(a))},$$ and so the inverse of $\exp$, which we denote $\log$, satisfies $$\log'(a) = \frac{1}{\exp'(\log(a))} = \frac{1}{\exp \log a} = \frac{1}{a}.$$ Integrating, and using $\exp(0) = 1$ (that is, $\log 1 = 0$) to determine the constant of integration, gives $$\log a = \int_1^a \frac{dt}{t}.$$ Using an elementary substitution gives \begin{align} \log a + \log b &= \int_1^a \frac{dt}{t} + \int_1^b \frac{dt}{t} \\ &= \int_1^a \frac{dt}{t} + \int_a^{ab} \frac{d(as)}{as} \\ &= \int_1^a \frac{dt}{t} + \int_a^{ab} \frac{ds}{s} \\ &= \int_1^{ab} \frac{dt}{t} \\ &= \log (ab) . \end{align} So, if we take $a = \exp x$ and $b = \exp y$, we can rearrange to produce the familiar identity $$\exp(x + y) = \exp(x)\exp(y).$$ In particular, $$\exp(nx) = \exp((n - 1)x) \exp(x),$$ so the result follows for nonnegative integers $n$ by induction. One can extend this to rational $n$ without too much fuss and then to all real $n$ by continuity.
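The additivity $\log(ab) = \log a + \log b$ at the heart of this answer can be checked numerically (a sketch, not part of the original; `log_integral` is our name for a trapezoid-rule approximation of $\int_1^a dt/t$):

```python
import math

def log_integral(a, steps=100_000):
    """Trapezoid-rule approximation of the integral of 1/t from 1 to a."""
    h = (a - 1) / steps
    total = 0.5 * (1 / 1 + 1 / a)      # endpoint terms, each weighted 1/2
    for k in range(1, steps):
        total += 1 / (1 + k * h)       # interior sample points
    return total * h

a, b = 2.0, 3.5
# log a + log b, log(ab) via the integral, and the library value agree:
print(log_integral(a) + log_integral(b), log_integral(a * b), math.log(a * b))
```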
---
There are some textbooks (e.g. Spivak, Calculus) that do it this way. Let $f : \mathbb R \to \mathbb R$ be the (unique) solution of
$$
f'(x)=f(x), \qquad f(0)=1. \qquad\qquad(*)
$$
The general solution of $u'=u$ is $u=Cf$ for some constant $C$.
Consider $f(x+y)$. For fixed $y$, write $g_y(x) = f(x+y)$. We have $$ g_y'(x) = \frac{d}{dx} f(x+y) = f'(x+y) = f(x+y)=g_y(x),\qquad g_y(0)=f(y), $$ so $g_y$ is a solution of $u'=u$, and therefore $g_y = Cf$ for some constant $C$. Setting $x=0$ gives $C = g_y(0) = f(y)$, so $$ g_y(x) = f(y)f(x), \\ f(x+y)=f(x)f(y). $$ This holds for all $x,y$. From this we may deduce $f(x)^n = f(nx)$ for positive integers $n$ by induction, then for all integers $n$ (using $f(x)f(-x)=f(0)=1$), then for rational numbers $n$. Note $f(x) = (f(x/2))^2 \ge 0$, and $f(x)f(-x)=1$ shows $f$ never vanishes, so $f > 0$ and $f(x)^n$ is well-defined for rational $n$.
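The induction from the functional equation can be sketched numerically (an illustration, not part of the original answer; we let $f = $ `math.exp` stand in for the solution of $(*)$, and `f_of_nx` is our name):

```python
import math

def f_of_nx(f, x, n):
    """Build f(nx) by n applications of f(x + y) = f(x) f(y)."""
    result = 1.0            # base case: f(0) = 1
    for _ in range(n):
        result *= f(x)      # inductive step: f((k+1)x) = f(kx) * f(x)
    return result

x = 0.37
for n in range(6):
    assert math.isclose(f_of_nx(math.exp, x, n), math.exp(n * x), rel_tol=1e-12)

print("f(nx) == f(x)^n for n = 0..5")
```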
Note: a similar argument, starting with the equation $f''=-f$, proves the addition formulas for $\sin$ and $\cos$.

---
I believe that $e$ was first found as $$ e=\lim_{n\to\infty}\left(1+\frac1n\right)^n\tag{1} $$ perhaps arising from compound interest problems and the like.
However, if we start with $f'(x)=f(x)$, then we have for any fixed $y$ $$ \begin{align} \frac{\mathrm{d}}{\mathrm{d}x}\frac{f(x+y)}{f(x)} &=\frac{f'(x+y)f(x)-f(x+y)f'(x)}{f(x)^2}\\ &=\frac{f(x+y)f(x)-f(x+y)f(x)}{f(x)^2}\\[6pt] &=0\tag{2} \end{align} $$ Therefore, $\frac{f(x+y)}{f(x)}$ is constant in $x$. In particular, $$ \frac{f(x+y)}{f(x)}=\frac{f(y)}{f(0)}\tag{3} $$ which, if we have $f(0)=1$, yields $$ f(x+y)=f(x)f(y)\tag{4} $$ Equation $(4)$ easily gives $f(nx)=f(x)^n$ by induction.
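A brief numerical check of the key step $(2)$–$(3)$ above (an illustration only, with $f = $ `math.exp` playing the role of the solution): for fixed $y$, the ratio $f(x+y)/f(x)$ should not depend on $x$, and should equal $f(y)/f(0) = f(y)$.

```python
import math

y = 0.8
# Sample the ratio f(x+y)/f(x) at several values of x.
ratios = [math.exp(x + y) / math.exp(x) for x in (-2.0, 0.0, 1.3, 5.0)]

# The ratio is constant in x and equals f(y), giving f(x+y) = f(x)f(y).
assert all(math.isclose(r, math.exp(y), rel_tol=1e-12) for r in ratios)
print("f(x+y)/f(x) is constant in x and equals f(y)")
```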