Why isn't the limit in the derivative's definition always equal to zero?

Here is what I mean:

$$\lim_{h \to 0}\frac{f(x+h)-f(x)}{h} = \lim_{h \to 0}\bigl(f(x+h)-f(x)\bigr)\cdot\lim_{h \to 0}\frac{1}{h} = 0\cdot\infty = 0$$

I get that it's not right and not supposed to be done, but I'm trying to understand why.

Also, why is it instead allowed to do this:

$$\lim_{h \to 0}\frac{g(x+h)\cdot\bigl(f(x+h)-f(x)\bigr)}{h} = \lim_{h \to 0}g(x+h)\cdot\lim_{h \to 0}\frac{f(x+h)-f(x)}{h}$$

If the answer is obvious, feel free to call me an idiot.

Appreciate it, thanks!

There are 5 answers below.

BEST ANSWER

As some others have mentioned, you can split the limit of a product into a product of limits only in certain cases; in particular, both individual limits must exist. In the first case you gave, you implied that

$$\lim_{h \to 0} \frac{1}{h} = \infty$$

However, this is incorrect. The limit as $h$ tends to $0$ from the right is $+\infty$, but from the left it is $-\infty$. Since the one-sided limits disagree, the limit does not exist, so you can't split the product.
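A quick numerical check makes the disagreement between the two sides visible. This is just an illustrative Python sketch, not part of the original answer:

```python
# Tabulate 1/h as h approaches 0 from the right and from the left:
# the values blow up to +infinity on one side and -infinity on the other,
# so lim_{h->0} 1/h does not exist.
for k in range(1, 6):
    h = 10.0 ** (-k)
    print(f"h = {h:+.5f}: 1/h = {1/h:+12.1f}   h = {-h:+.5f}: 1/h = {1/(-h):+12.1f}")
```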

ANSWER

You seem to think that $0\times\infty=0$. Actually, $\times$ is defined only for numbers, and $\infty$ is not a number. Therefore, $0\times\infty$ is undefined.

On the other hand, as far as limits are concerned, if $\lim_{h\to0}f(h)=0$ and if $\lim_{h\to0}g(h)=\infty$, then the limit $\lim_{h\to0}f(h)g(h)$ may not exist and, if it does exist, it can be anything. For instance, if $f(h)=kh$ and $g(h)=\frac1h$, then $\lim_{h\to0}f(h)g(h)=k$.

Splitting a limit of a product into a product of limits, as in your second example, is correct whenever both limits exist (and are real numbers).
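To see "it can be anything" concretely, here is a small numerical sketch (illustrative only, using the $f(h)=kh$ and $g(h)=\frac1h$ pair from the answer):

```python
# f(h) = k*h tends to 0 and g(h) = 1/h is unbounded, yet the product tends to k:
# the form "0 times infinity" can come out to any value.
h = 1e-8
for k in [2.0, -3.0, 0.5]:
    print(f"k = {k:+.1f}: f(h)*g(h) = {(k * h) * (1 / h):+.4f}")  # approximately k

# Other pairings of the same "0 times infinity" form give different answers:
print("h**2 * (1/h)   =", h**2 * (1 / h))   # tends to 0
print("h * (1/h**2)   =", h * (1 / h**2))   # unbounded as h -> 0
```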

ANSWER

You can use the fact:

$$\lim_{h\rightarrow 0}f(h)g(h)=\left(\lim_{h\rightarrow 0}f(h)\right)\left(\lim_{h\rightarrow 0}g(h)\right)$$

ONLY when the individual limits exist; this is part of what is called the algebra of limits.
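Stated precisely (my wording of the standard textbook result):

$$\text{if } \lim_{h\rightarrow 0}f(h)=L \text{ and } \lim_{h\rightarrow 0}g(h)=M \text{ with } L,M\in\mathbb{R}, \text{ then } \lim_{h\rightarrow 0}f(h)g(h)=LM.$$

The hypothesis cannot be dropped: with $f(h)=h$ and $g(h)=\frac1h$ the product has limit $1$, yet $\lim_{h\rightarrow 0}g(h)$ does not exist, so the right-hand side of the identity is meaningless.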

ANSWER

Because $\infty$ isn't a defined value; a limit that "equals $\infty$" does not exist in the sense the product rule requires.

$$\lim_{x\to a}f(x)\cdot g(x) = \lim_{x\to a}f(x)\cdot \lim_{x\to a}g(x) \text{ when both limits exist (as finite numbers).}$$

In your case, you used the following limit: $$\lim_{h \to 0}\frac{1}{h} = \infty$$

Since this limit does not reach a finite, defined value, the product rule is not applicable here.

In the second example, the limit was correctly broken down.

$$\lim_{h \to 0}\frac{g(x+h)\cdot(f(x+h)-f(x))}{h} = \lim_{h \to 0}g(x+h)\cdot\lim_{h \to 0}\frac{f(x+h)-f(x)}{h} $$

Both limits exist here (assuming $g$ is continuous and $f$ is differentiable at $x$), so this is correct. As $h \to 0$, the first limit approaches $g(x)$ and the second approaches the derivative of $f$ at $x$, that is, $f'(x)$. Therefore, the limit was correctly simplified, reaching $g(x)\cdot f'(x)$.
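As a sanity check, the split can be evaluated numerically for concrete choices of the functions. The picks $f=\sin$, $g(t)=t^2$, and $x=1$ below are mine, purely for illustration:

```python
import math

# Check that g(x+h)*(f(x+h)-f(x))/h approaches g(x)*f'(x) as h -> 0,
# for the assumed choices f = sin (so f' = cos) and g(t) = t**2, at x = 1.0.
f = math.sin

def g(t):
    return t * t

x = 1.0
target = g(x) * math.cos(x)  # g(x) * f'(x)

for h in [1e-2, 1e-4, 1e-6]:
    quotient = g(x + h) * (f(x + h) - f(x)) / h
    print(f"h = {h:.0e}: quotient = {quotient:.8f}   target = {target:.8f}")
```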

ANSWER

The key here is a clear and precise understanding of the algebra of limits. The product rule for limits is generally proved in textbooks under the assumption that the limits of both factors exist. It can be extended to the case where the limit of one of the factors exists and is non-zero (in that case the product has a limit if and only if the other factor does). Unfortunately, neither the usual nor the extended version of the product rule applies here, since one of the factors tends to $0$ and the other factor has no limit. Hence the application of the product rule in your first example is wrong.

In the second example the limits of both factors exist, and hence the usual product rule can be applied.
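The extended version alluded to above can be stated as follows (my phrasing of a standard fact): if $\lim_{h\to0}f(h)=L$ with $L\neq0$, then $\lim_{h\to0}f(h)g(h)$ exists if and only if $\lim_{h\to0}g(h)$ exists, in which case

$$\lim_{h\to0}f(h)g(h)=L\cdot\lim_{h\to0}g(h).$$

In the first example the factor that has a limit tends to $0$, so even this extension gives no help.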