Show $\lim_{h\to 0} \frac{(a^h-1)}{h}$ exists without l'Hôpital or even referencing $e$ or natural log


Taking as our definition of exponentiation repeated multiplication (extended to real exponents by continuity), can we show that the limit

$$\lim_{h\to 0}\dfrac{a^h-1}{h}$$

exists, without l'Hôpital, $e$, or even natural logarithm? Sure, l'Hôpital will work, but that's circular if we're developing calculus of transcendentals from first principles. There is a good answer to this question already by user Neal, but he uses the exponential function with base $e$ (it's been answered many times: see also here, here, and here).

But using the special properties of $e$ strikes me as circular too. Not literally logically circular (we are not invoking results we're trying to prove, since there are definitions of $e^x$ that make it trivial to verify the derivative), but perhaps pedagogically circular: to a complete novice, the special properties of $e$ appear unmotivated, because they cannot be justified without reference to the very derivative we are trying to compute (or else a detour through logarithms, but let's not).

Can we find nice squeeze theorem bounds like Neal's, but for the function $a^x$ instead of $e^x$, with the additional handicap that we can't just write $a^x=e^{x\log a}$? I thought of substituting a series expansion for $\log a$, but didn't come up with any bounds that were nicely polynomial in both $x$ and $a$.

I wonder whether the geometric proof of $\lim (\sin x)/x$ (see for example, robjohn's answer here) could be adapted.

Obviously without a reference to natural logarithm, we cannot compute the value of the limit. But I just want to show it exists (via squeeze theorem or monotone convergence). Once we know this limit exists, we can show it behaves like a logarithm, whence there is a unique base for which the limit is 1, which we call $e$. The rest of the development of calculus of exponentials and logs follows easily. This seems like the approach that would appear the most accessible yet motivated to a novice calculus student.

Analogous limits to $\lim\dfrac{a^h-1}{h}$ for differentiating exponential functions are the limits $\lim\limits_{n\to\infty} (1+\frac{1}{n})^n$ and $\lim\limits_{n\to\infty} (1+\frac{1}{n})^{n+1}$ for differentiating the logarithm, if you prefer to start with that as your primitive concept. Both limits can be shown to exist using Bernoulli's inequality (see WimC's answer here for the first limit, and David Mitra's answer here for the second). I tried without success to use Bernoulli's inequality to show my sequence was monotone. The limit can also be analyzed using the AM-GM inequality, as in user94270's answer to this question, so that inequality may help here.
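As a quick numerical sanity check (Python; an illustration, not a proof), one can watch the two Bernoulli-squeezed sequences close in on a common limit:

```python
# Numerical check (not a proof): (1 + 1/n)^n increases,
# (1 + 1/n)^(n+1) decreases, and the two squeeze a common limit.
lower = [(1 + 1 / n) ** n for n in range(1, 1001)]
upper = [(1 + 1 / n) ** (n + 1) for n in range(1, 1001)]

assert all(x < y for x, y in zip(lower, lower[1:]))  # strictly increasing
assert all(x > y for x, y in zip(upper, upper[1:]))  # strictly decreasing
assert all(l < u for l, u in zip(lower, upper))      # lower bound < upper bound
```

Both sequences at $n=1000$ already agree to about three decimal places, consistent with a common limit near $2.718$.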

I would also accept an explanation of why the limit cannot be computed without transcendental techniques, or an opinion why this is not a pedagogically sound approach to introducing the calculus of exponentials and logarithms.

Edit: This question has a nice solution by Paramanand Singh to a closely related problem.

There are 5 answers below.

BEST ANSWER

Let $a>1.$ I assume $a^x$ is continuous, and that the basic exponent law $a^{x+y}=a^xa^y$ holds.

Claim: $a^x$ is convex on $[0,\infty).$ Proof: Because $a^x$ is continuous, it suffices to show $a^x$ is midpoint convex. Suppose $x,y\in [0,\infty).$ Using $(uv)^{1/2} \le (u+v)/2$ for nonnegative $u,v,$ we get $a^{(x+y)/2} = (a^{x} a^{y})^{1/2} \le (a^x+a^y)/2.$

Now if $f$ is convex on $[0,\infty),$ then $(f(x)-f(0))/x$ is an increasing function of $x$ for $x\in(0,\infty).$ This is a simple and easily proved property of convex functions.
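For completeness, here is the one-line proof of that property: for $0 < s < t$, write $s = \left(1 - \frac{s}{t}\right)\cdot 0 + \frac{s}{t}\cdot t$ and apply convexity:

```latex
f(s) \le \left(1 - \frac{s}{t}\right) f(0) + \frac{s}{t}\, f(t)
\quad\Longrightarrow\quad
\frac{f(s) - f(0)}{s} \le \frac{f(t) - f(0)}{t}.
```

The second inequality is just the first with $f(0)$ moved to the left and both sides divided by $s > 0$.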

Claim: $\lim_{x\to 0^+}(a^x-1)/x$ exists. Proof: All of these quotients are bounded below by $0$. As $x$ decreases to $0$, $(a^x-1)/x$ decreases, by the above. Because of the lower bound of $0$, the limit exists.

It follows that $\lim_{x\to 0}(a^x-1)/x$ exists: if $x>0$, then $a^{-x} = 1/a^{x}$, so $(a^{-x}-1)/(-x) = (a^x-1)/(xa^x)$, which tends to the same limit as $x\to 0^+$. To handle $0<a<1$, look at $[(1/a)^x-1]/x$ to see that $\lim_{x\to 0}(a^x-1)/x$ exists.
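A small numerical illustration of this squeeze (Python; not part of the proof, and `math.log` appears only to check the numerics against the known value $\ln a$):

```python
import math

# Illustration (not a proof) for a = 2: the quotient (a^x - 1)/x
# decreases as x decreases to 0+, stays positive, and the one-sided
# limits from the right and from the left agree.
a = 2.0
xs = [10 ** (-k) for k in range(1, 8)]        # x = 0.1, 0.01, ..., 1e-7
right = [(a ** x - 1) / x for x in xs]        # approach from x > 0
left = [(a ** (-x) - 1) / (-x) for x in xs]   # approach from x < 0

assert all(p > q > 0 for p, q in zip(right, right[1:]))  # decreasing, positive
assert all(p < q for p, q in zip(left, left[1:]))        # increasing toward limit
# The common value is log_e(a); math.log is used only as a numeric check.
assert abs(right[-1] - math.log(a)) < 1e-6
assert abs(left[-1] - math.log(a)) < 1e-6
```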

ANSWER

I'll sketch one way. Assume that $a > 1$. (The case $a < 1$ can be done similarly.)

  1. Show that $f(x) = a^x$ is continuous (in fact, continuity from the right is enough) and strictly increasing.
  2. Every continuous monotone function is differentiable almost everywhere. (This is a well-known, but not entirely trivial theorem.)
  3. Show that $f(x+y) = f(x)f(y)$. Thus, if $f$ is differentiable at some point $x=x_0$, then $f$ is also differentiable at $x=0$, which shows that your limit exists.
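The translation step in item 3 can be made explicit: with $f(0) = 1$ and $f(x_0+h) = f(x_0)f(h)$, the difference quotients at $x_0$ and at $0$ differ only by the nonzero factor $f(x_0)$:

```latex
\frac{f(x_0 + h) - f(x_0)}{h} = f(x_0)\,\frac{f(h) - 1}{h}
\quad\Longrightarrow\quad
\lim_{h \to 0}\frac{a^h - 1}{h} = \frac{f'(x_0)}{f(x_0)}.
```

So differentiability at any single point $x_0$ forces the limit at $0$ to exist.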
ANSWER

This addresses showing that $\lim c_n=\lim n(a^{1/n}-1)$ exists. Let $u=a^{1/n}$ and apply the identity $u^n-1=(u-1)(1+u+\cdots + u^{n-1}).$ Multiplying the numerator and denominator of $c_n$ by the second factor of that identity gives $c_n=(a-1)/I_n,$ where $$I_n=\frac{1}{n} [ a^{0/n}+a^{1/n}+\cdots +a^{(n-1)/n}],$$ which is a left-endpoint Riemann sum for the integral $\int_0^1 a^x \, dx,$ and hence converges to that integral, showing, as desired, that the limit exists.
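A quick numerical check of the identity $c_n = (a-1)/I_n$ and of the convergence (Python; `math.log` is used only to verify the numerics, since the answer itself never needs it):

```python
import math

# Check (not a proof) that c_n = n(a^{1/n} - 1) equals (a - 1)/I_n,
# where I_n is the left-endpoint Riemann sum for a^x on [0, 1].
a = 3.0
for n in (10, 100, 1000):
    c_n = n * (a ** (1 / n) - 1)
    I_n = sum(a ** (i / n) for i in range(n)) / n
    assert math.isclose(c_n, (a - 1) / I_n, rel_tol=1e-9)

# Since the integral of a^x over [0, 1] is (a - 1)/log_e(a),
# c_n should approach log_e(3) ~ 1.0986 as n grows.
assert abs(1000 * (a ** (1 / 1000) - 1) - math.log(a)) < 1e-2
```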

Typically, at least, a calc 2 student would be familiar with left-hand Riemann sums and their convergence to the associated integral.

ANSWER

If $a = 1$ the limit is obviously $0$. Let $a > 1$ and $0 < b < 1$. Using simple algebra it is easy to show that $$\frac{a^{r} - 1}{r} > \frac{a^{s} - 1}{s},\,\,\frac{1 - b^{r}}{r} < \frac{1 - b^{s}}{s}\tag{1}$$ where $r, s$ are positive rationals with $r > s$ (see equation $(11)$ of this post). At the OP's request, I provide the proof of the above inequality here.

First let us assume that $r, s$ are positive integers. Clearly $a^{i} < a^{r}$ for all $i = 0, 1, 2,\dots, r - 1$, and hence on adding these inequalities we get $$1 + a + a^{2} + \dots + a^{r - 1} < ra^{r}$$ Multiplying the above inequality by $(a - 1) > 0$ we get $$a^{r} - 1 < ra^{r}(a - 1) = ra^{r + 1} - ra^{r}$$ or $$(r + 1)a^{r} - r - 1 < ra^{r + 1} - r$$ or $$(r + 1)(a^{r} - 1) < r(a^{r + 1} - 1)$$ so that we finally have $$\frac{a^{r} - 1}{r} < \frac{a^{r + 1} - 1}{r + 1}$$ It follows that the sequence $t_{n} = (a^{n} - 1)/n$ is strictly increasing, and hence $t_{r} > t_{s}$ for positive integers $r,s$ with $r > s$. This proves the first inequality of $(1)$ under the restriction that $r, s$ are positive integers. The second inequality, dealing with $0 < b < 1$, can be proved similarly, starting from $b^{i} > b^{r}$ for all $i = 0, 1, 2, \dots, r - 1$.
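A quick numerical spot-check of the two monotone sequences behind $(1)$ (Python; an illustration, not a proof):

```python
# Spot-check (not a proof): for a > 1 the sequence t_n = (a^n - 1)/n is
# strictly increasing, and for 0 < b < 1 the companion sequence
# (1 - b^n)/n is strictly decreasing, as inequality (1) predicts.
a, b = 2.5, 0.4
t = [(a ** n - 1) / n for n in range(1, 40)]
u = [(1 - b ** n) / n for n in range(1, 40)]

assert all(x < y for x, y in zip(t, t[1:]))  # increasing
assert all(x > y for x, y in zip(u, u[1:]))  # decreasing
```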

Next we extend inequality $(1)$ to the case where $r, s$ are positive rational numbers, with $r = p/q$ and $s = m/n$ where $p, q, m, n$ are positive integers and $r > s$, so that $np > mq$. Let $c = a^{1/nq}$, so that $c > 1$, and therefore via inequality $(1)$ (with the restriction to positive integer indices) we get $$\frac{c^{np} - 1}{np} > \frac{c^{mq} - 1}{mq}$$ Multiplying the above inequality by $nq > 0$, and noting that $c^{np} = a^{r}$, $c^{mq} = a^{s}$, $np/nq = 1/r^{-1} \cdot 1 = r$ and $mq/nq = s$, we get $$\frac{a^{r} - 1}{r} > \frac{a^{s} - 1}{s}$$ so that inequality $(1)$ is proved for positive rational $r, s$.

If we extend the definition of $a^{x}$ to real exponents $x$ by any method (like the one using Dedekind cuts suggested by the OP), we see that the above inequalities hold even when $r, s$ are positive reals with $r > s$. However, any such procedure is effectively based on limiting processes, and hence we can only obtain the weaker (non-strict) versions of the above inequalities in this manner. We are lucky that we only need the weaker version here. Thus we have $$\frac{a^{r} - 1}{r} \geq \frac{a^{s} - 1}{s},\,\,\frac{1 - b^{r}}{r} \leq \frac{1 - b^{s}}{s}\tag{2}$$ where $r, s$ are real numbers with $r > s > 0$ and $a, b$ are real numbers with $a > 1 > b > 0$. It is now clear that for $a > 1$ the function $f(x) = (a^{x} - 1)/x$ is increasing on $(0, \infty)$. Also $f(x) > 0$ for all $x > 0$. Since $f$ decreases as $x$ decreases to $0$ and is bounded below by $0$, it follows that $f(x) \to L$ as $x \to 0^{+}$ with $L \geq 0$.

If $x < 0$ so that $x = -y$ and $y > 0$ then we can see that $$f(x) = f(-y) = \frac{a^{y} - 1}{ya^{y}} = \frac{a^{y} - 1}{y}\frac{1}{a^{y}} \to L$$ as $y \to 0^{+}$. It thus follows that $f(x) \to L$ as $x \to 0$ and $L \geq 0$.

If $0 < a < 1$ then we can use the inequalities related to $b$ above and show that $f(x) \to L$ as $x \to 0$ with $L \leq 0$. With slightly more effort we can show that $L = 0$ if and only if $a = 1$. The existence of this limit for all $a > 0$ defines a new function of $a$, which is normally called the logarithm of $a$. From here we can develop the theory of the exponential and logarithmic functions. See more details in this post.

ANSWER

The claim by OP,

But using the special properties of $e$ strikes me as circular too; for a complete novice, the special properties of $e$ cannot be justified without reference to the very derivative we are trying to compute (or else a detour through logarithms, but let's not).

is not true. Those properties can be constructed and proved without circularity (albeit the hard way!). I'll refer you to Principles of Mathematical Analysis by Walter Rudin (i.e., Baby Rudin) for the details.

On page 22, exercise 6, $b^x$ with $b>1$ is defined in the Dedekind-cut way the OP gave in the comments.

On pages 63-64 the constant $e$ is defined to be $\displaystyle \sum_{n=0}^{\infty} \frac{1}{n!}$, and $\displaystyle e=\lim_{n\rightarrow \infty} (1+\frac{1}{n})^{n}$ is proved.

On pages 178-179, a "special function" $E(z)$ is defined to be $\displaystyle \sum_{n=0}^{\infty} \frac{z^n}{n!}$ and a proof that $E(x)=e^x$ for every real number $x$ is given.

Finally, once we have power series, the limit is easy to compute.
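Concretely: writing $a = E(c)$ for the appropriate real $c$ (the number that will later be named $\log a$), and using the property $E(c)^h = E(ch)$ established along the way in Rudin, the series gives

```latex
\frac{a^h - 1}{h} = \frac{E(ch) - 1}{h}
= c + \frac{c^2 h}{2!} + \frac{c^3 h^2}{3!} + \cdots
\;\longrightarrow\; c \quad \text{as } h \to 0,
```

where passing to the limit is justified because for $|h| \le 1$ the tail after the leading term $c$ is bounded in absolute value by $|h| \, E(|c|)$.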

Issues to consider

  1. Why does $\displaystyle \sum_{n=0}^{\infty} \frac{1}{n!}$ converge? You can prove that the sequence of partial sums is increasing and bounded above by $3-\frac{1}{n}$ (by induction), therefore bounded above by $3$, and therefore convergent.
  2. You will need the ratio test or the root test to show that $E(x)$ converges absolutely for every $x$, and is therefore well defined for every $x$.
  3. To prove $E(x)E(y)=E(x+y)$ (an important step in proving $E(x)=e^x$) you'll need properties of absolutely convergent power series (the Cauchy product).
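Points 1 and 3 can be spot-checked numerically (Python; the partial-sum bound is verified in exact rational arithmetic, while the functional equation $E(x)E(y)=E(x+y)$ is checked only approximately, with a truncated series):

```python
import math
from fractions import Fraction

# Point 1: the partial sums s_n of sum 1/n! satisfy s_n <= 3 - 1/n.
# Exact rational arithmetic avoids floating-point ties (equality holds
# for n = 1, 2, 3).
for n in range(1, 30):
    s_n = sum(Fraction(1, math.factorial(k)) for k in range(n + 1))
    assert s_n <= 3 - Fraction(1, n)

# Point 3: E(x)E(y) = E(x + y), checked with a truncated series for E.
def E(x, terms=60):
    return sum(x ** k / math.factorial(k) for k in range(terms))

assert math.isclose(E(1.5) * E(2.0), E(3.5), rel_tol=1e-12)
assert math.isclose(E(1.0), math.e, rel_tol=1e-12)
```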