How to calculate this limit without L'Hôpital's rule


I know that for very small values of $x$, $$(1+x)^n \approx 1+ nx,$$ and I can prove it using Taylor series.
But I want to prove it without any hint of derivatives.

So... In order to calculate the following limit

$$\lim_{x \to 0} \frac{(1+x)^n-1}{x},$$

I know that the result must be equal to $n$.

But is there a method that avoids L'Hôpital's rule?

Note: $n$ can be any real number, not just an integer, so I don't want to use the binomial theorem.
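Not a proof, of course, but a quick numerical sanity check (Python, purely illustrative) that the difference quotient does approach $n$ even for non-integer $n$:

```python
# Sanity check (not a proof): ((1+x)^n - 1)/x -> n as x -> 0,
# including non-integer and negative n.
def difference_quotient(n, x):
    return ((1 + x)**n - 1) / x

for n in (2, 0.5, -1.3, 3.14159):
    for x in (1e-3, 1e-6):
        q = difference_quotient(n, x)
        print(f"n = {n:>8}, x = {x:.0e}: quotient = {q:.6f}")
```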

6 Answers

---

Hint

$$\dfrac{(1+x)^n-1}{x}=\dfrac{1+nx+\binom{n}{2}x^2+\dots+nx^{n-1}+x^n-1}{x}=n+\binom{n}{2}x+\dots+nx^{n-2}+x^{n-1}.$$

(This uses the binomial theorem, so it only covers positive integer $n$.)
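For positive integer $n$ the quotient really is this polynomial; a small check of the expansion against the direct quotient, using `math.comb` (illustrative Python, integer $n$ only):

```python
import math

def quotient_direct(n, x):
    # ((1+x)^n - 1)/x computed directly
    return ((1 + x)**n - 1) / x

def quotient_poly(n, x):
    # The expanded form: sum_{k=1}^{n} C(n,k) x^{k-1}; requires integer n
    return sum(math.comb(n, k) * x**(k - 1) for k in range(1, n + 1))

n, x = 5, 0.01
print(quotient_direct(n, x), quotient_poly(n, x))  # the two agree; near n for small x
```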

---

Hint:

Let $u=1+x$. Then $$\lim_{x\to 0}\dfrac{(1+x)^n-1}{x}=\lim_{u\to 1}\dfrac{u^n-1}{u-1}. $$ Now factor $u^n-1$ and take the limit.

---

Hint: $z^n - 1 = (z-1)(z^{n-1} + z^{n-2} + \cdots + 1)$. There are $n$ terms in the second factor, so it equals $n$ when $z = 1$.
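The factorization in the last two hints is easy to check numerically; a short sketch (illustrative Python, integer $n$):

```python
def geometric_sum(z, n):
    # z^{n-1} + z^{n-2} + ... + 1, the second factor in z^n - 1 = (z-1)(...)
    return sum(z**k for k in range(n))

n = 7
for z in (1.001, 1.0001):
    # (z^n - 1)/(z - 1) equals the geometric sum; as z -> 1 it tends to n
    print((z**n - 1) / (z - 1), geometric_sum(z, n))
print(geometric_sum(1, n))  # n terms, each equal to 1, so exactly n
```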

---

Well (take it jokingly): since you want to go back in time past Taylor, Newton and Leibniz, please allow a stop in the Renaissance.
At that time $(1+x)^r$ was much in vogue as the compound-interest formula, and it was of course well known that at low interest it could be simplified to $1+rx$, also for $r$ corresponding to a fractional period of time.
Then you might have had the chance to make the acquaintance of J. Napier and H. Briggs and join them in discussing the best base to give the logarithm, choosing the one for which $\ln(1+x) \approx x$.
But then you would have had to wait some decades to learn from Jacob Bernoulli that this optimal base is the limit of $(1+1/n)^n$.

---

Let $\alpha$ be a real number strictly between $0$ and $1$. Bernoulli's Inequality asserts that $$x^\alpha\leq 1+\alpha (x-1)$$ for each nonnegative real $x$.

There are a multitude of ways to establish this inequality. If you are adamant about avoiding derivatives entirely, you can accomplish it with continuity and the case of rational $\alpha$ (which may be proven via the arithmetic–geometric mean inequality, which can in turn be proven by...). If you want to avoid even continuity, you'd use least upper bounds and greatest lower bounds.

But let me assume that you have established Bernoulli's inequality at least. Then it's quite easy to derive from it that $$\alpha(1-x^{-1})\leq x^\alpha-1\leq \alpha(x-1)$$ for any positive real number $x$ (so long as $\alpha$ is strictly between $0$ and $1$). The right-hand inequality is precisely Bernoulli's. The left-hand one is obtained by using $x^{-1}$ in Bernoulli's and then applying the inequality $x+x^{-1}\geq 2$.

Of course, this inequality proves your assertion for $\alpha$ strictly between $0$ and $1$. What do you do when this is not the case? Choose $k$ in $\mathbb{Z}$ such that $0<\alpha/k<1$. Then use the prior case with $\alpha/k$ to conclude $$\frac{\alpha}{k}(1-(x^k)^{-1})\leq (x^k)^{\alpha/k}-1\leq\frac{\alpha}{k}((x^k)-1)$$ which, simplified, looks like $$\frac{\alpha}{k}(1-x^{-k})\leq x^\alpha-1\leq\frac{\alpha}{k}(x^k-1)\,.$$ The integer case will finish the proof from here.
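A numeric illustration (not part of the proof) of the two-sided bound $\alpha(1-x^{-1})\leq x^\alpha-1\leq \alpha(x-1)$ for $0<\alpha<1$:

```python
# Check the squeeze alpha*(1 - 1/x) <= x^alpha - 1 <= alpha*(x - 1)
# for a few positive x and alpha strictly between 0 and 1.
alpha = 0.5
for x in (0.5, 0.9, 1.1, 2.0):
    lower = alpha * (1 - 1 / x)
    middle = x**alpha - 1
    upper = alpha * (x - 1)
    assert lower <= middle <= upper
    print(f"x = {x}: {lower:.4f} <= {middle:.4f} <= {upper:.4f}")
```

Dividing through by $x-1$ and letting $x\to 1$ squeezes the difference quotient toward $\alpha$.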

---

If you take $e := \lim_{n\to\infty} (1+1/n)^n$, then one can show that $\lim_{n\to\infty} (1+y/n)^n = e^y$. In fact, one can show that $$ 1 + x \le e^x \quad\text{and}\quad e^y \le 1 + y + \epsilon(y), $$ where $\epsilon(y)$ is a term such that $\epsilon(y)/y \to 0$ as $y \to 0$. Importantly, this can all be done without the definition of a derivative; in particular, it is the type of question that can be given to a first-year undergraduate in their first analysis course. Given this, one can bound $(1+x)^n$ above by $e^{nx}$, and then $e^{nx}$ by $1 + nx + \epsilon(nx)$. We then find that the limit is $n$. (Note that here $n$ is a fixed positive real---not necessarily an integer---and we take $x \to 0$; there is no "$n \to \infty$ as $x \to 0$".)
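A rough numeric illustration (illustrative Python, not a proof) of the bound $1+x\le e^x$ and of the quotient settling at a fixed real $n$:

```python
import math

n = 2.5  # fixed real exponent, not necessarily an integer
for x in (1e-2, 1e-4, 1e-6):
    # sandwich: 1 + x <= e^x gives (1+x)^n <= e^{nx},
    # and e^{nx} <= 1 + nx + eps(nx) pulls the quotient back down to n
    assert 1 + x <= math.exp(x)
    q = ((1 + x)**n - 1) / x
    print(f"x = {x:.0e}: quotient = {q:.6f}  (n = {n})")
```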