Incongruities with derivatives and differentials


I read in Piskunov that the increment $\Delta y$ of a function can be written as:

$\Delta y = f'(x) \Delta x + \alpha \Delta x$

And, when ${\Delta x\to 0}$ ,

$dy=f'(x)dx$

The problem is, doesn't that mean that it is possible to write the derivative of $f$ as the quotient

$f'(x)=dy/dx$ ?

It is my understanding that a derivative can be written as the limit of a quotient

$\lim\limits_{h\to 0} \frac {f(x+h)-f(x)}{h}$

But not as a quocient of limits, because the denominator would be zero.

This month I started my first calculus course, and I am very confused by some of these details, especially those that have to do with notions of infinitesimals. I also have many doubts regarding Leibniz notation, as it is repeatedly used as a quotient when, according to every source, $\frac {df(x)}{dx}$ is just notation for $f'(x)$, and not a quotient.

An example of this is the following deduction from my physics course (in Portuguese),

[image: derivation from the physics course, not shown]

where the "denominator" $dx$ is transferred to the other side of the equation.

Any light shed on this would be greatly appreciated.


On BEST ANSWER

The answer is "no, it is not logically reasonable to interpret $dy/dx$ as a quotient, or at least not until you take a much more advanced course". Sorry if that's confusing.

(By the way, where you originally wrote "when $x \rightarrow 0$", you presumably meant $\Delta x \rightarrow 0$.)

Before we begin, remember that the basic intuition is the following. Differentiable functions are those whose graphs are approximately linear when you look at a small enough region. Consider $f(x)=x^2$. Clearly, this "U"-shaped graph is not linear. However, suppose you only looked at values of $x$ in the interval $(1.99, 2.01)$. This would look linear. If you looked only in the interval $(1.999, 2.001)$, it would look even closer to linear, and so forth. So $dy/dx$ is "almost" a quotient.
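
This "zooming in" can be checked numerically. The following sketch (not from the original answer; the function and interval widths are just illustrative choices) computes the slope of the secant line through the endpoints of shrinking intervals around $x=2$ for $f(x)=x^2$; the slopes settle toward a single value, $4$, which is what "locally linear" means.

```python
# Zoom in on f(x) = x^2 near x = 2: over shrinking intervals, the
# secant slope through the endpoints settles toward the derivative, 4.

def f(x):
    return x * x

for half_width in (0.01, 0.001, 0.0001):
    a, b = 2 - half_width, 2 + half_width
    secant_slope = (f(b) - f(a)) / (b - a)
    print(f"interval ({a}, {b}): secant slope = {secant_slope}")
```

(For this particular symmetric interval the secant slope is exactly $4$ for every width, since $((2+h)^2-(2-h)^2)/2h = 4$; for asymmetric intervals it merely approaches $4$.)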

Note that not all functions have this property in all regions. For example, if you take $f(x)=|x|$ in an interval around $x=0$, this never looks linear; one always sees the "kink" at $x=0$. Incidentally, there are continuous functions which are not differentiable at any point ("nowhere differentiable functions").
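The "kink" in $|x|$ can also be seen numerically. In this sketch (again an illustration I am adding, not part of the answer), the difference quotients taken from the right and from the left of $x=0$ approach $+1$ and $-1$ respectively, so no single slope $m$ works there:

```python
# One-sided difference quotients of f(x) = |x| at x = 0 disagree:
# +1 from the right, -1 from the left, so f is not differentiable at 0.

def f(x):
    return abs(x)

for h in (0.1, 0.01, 0.001):
    right = (f(0 + h) - f(0)) / h     # slope seen from the right
    left = (f(0 - h) - f(0)) / (-h)   # slope seen from the left
    print(f"h = {h}: right quotient = {right}, left quotient = {left}")
```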

Now, given this intuition, one wants a formal definition. Let us suppose that I claim that near $x=x_0$, the function $f$ is approximately linear with slope $m$. How badly wrong can I be? Well, the error would be $f(x) - f(x_0) - m(x-x_0)$. We want that to be small. How small? Well, perhaps small relative to how far $x$ is from $x_0$. So we now consider the ratio $(f(x) - f(x_0) - m(x - x_0))/(x-x_0)$. Note that this ratio has a problem if $x = x_0$.

Now, the concept of "small error" is formalized by asking that the above ratio tend to $0$ as $x$ tends to $x_0$. Recall the "delta-epsilon" definition of a limit. Thus, if I want the above fractional error to be 1%, I might require that $|x-x_0| < 0.00001$, and if I require that the fractional error be 0.1%, I might require that $|x-x_0| < 0.00000001$. In general, if I require that the fractional error be less than $\epsilon$, you can give me an appropriate $\delta$ such that if $|x-x_0| < \delta$, then $|(f(x) - f(x_0) - m(x-x_0))/(x-x_0)|< \epsilon$. If you can't do this, then $m$ is not the derivative at $x_0$. If you can't do this for any $m$, then the function is not differentiable at $x_0$. Note that the choice of $\delta$ depends upon both the point at which the derivative is being sought ($x_0$) and the upper bound on the fractional error required ($\epsilon$).
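Here is a numeric sketch of that fractional-error test (my own illustration; the choices $f(x)=x^2$, $x_0=2$, and the candidate slopes are assumptions, not from the answer). For the correct slope $m=4$ the fractional error shrinks to $0$ as $x \to x_0$; for a wrong slope such as $m=3.9$ it stalls near $0.1$, so no $\delta$ can be found for small $\epsilon$:

```python
# Fractional error (f(x) - f(x0) - m*(x - x0)) / (x - x0) for f(x) = x^2
# at x0 = 2. For x^2 this simplifies to x + 2 - m, so it tends to 0
# only when m = 4 (the derivative).

def f(x):
    return x * x

def fractional_error(f, x0, m, x):
    return (f(x) - f(x0) - m * (x - x0)) / (x - x0)

for m in (4.0, 3.9):
    for x in (2.1, 2.01, 2.001):
        err = fractional_error(f, 2.0, m, x)
        print(f"m = {m}, x = {x}: fractional error = {err:.4f}")
```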

Incidentally, note that the above definition is a mathematical abstraction in several ways. For example, in practical cases I might have a function which is not really differentiable because it has lots of microscopic "kinks", but it might be that for the range of values in which I am interested a linear approximation is useful and adequate. Alternatively, it might be that the relevant $\delta$ in the above definition is so small that the linear approximation is not useful over the range of values of interest, even though the function is in principle differentiable. Additionally, note that for "physical" situations, it is often the case that the units of $f(x)$ are different from the units of $x$, so speaking of the ratio as being small is a bit goofy, but asking that the ratio tend to zero is still well defined.

So, if one defines $\Delta y = f(x) - f(x_0)$ and $\Delta x = x - x_0$ (holding $x_0$ constant temporarily), generally $\Delta y /\Delta x$ is not equal to $f'(x_0)$, but tends to that value in the limit.
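This ties directly back to Piskunov's identity $\Delta y = f'(x)\,\Delta x + \alpha\,\Delta x$: the leftover term $\alpha = \Delta y/\Delta x - f'(x_0)$ is the gap between the difference quotient and the derivative, and it shrinks with $\Delta x$. A small sketch (my own example, using $f(x)=x^2$ at $x_0=3$, where $f'(x_0)=6$):

```python
# alpha = dy/dx - f'(x0) from Piskunov's identity dy = f'(x)*dx + alpha*dx.
# For f(x) = x^2 at x0 = 3, alpha equals dx itself, so it vanishes as dx -> 0,
# while dy/dx stays close to, but never equal to, f'(x0) = 6.

def f(x):
    return x * x

x0, fprime = 3.0, 6.0
for dx in (0.1, 0.01, 0.001):
    dy = f(x0 + dx) - f(x0)
    alpha = dy / dx - fprime
    print(f"dx = {dx}: dy/dx = {dy / dx}, alpha = {alpha}")
```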

Now, there are two modifications you will see when you study the subject more deeply.

First, it is possible to extend the real number system by adding infinitesimal numbers. This is a quite remarkable discovery which is relatively recent ("non-standard analysis") and I don't know much about it, except that the proof that the extension is logically consistent involves subtle issues from the foundations of set theory. However, I think that in non-standard analysis it is possible to interpret $dy/dx$ as an actual quotient, but of infinitesimals.

The second item is that in more advanced classes one will reinterpret the symbols $dx$ and $dy$ as "differential forms", and with this reinterpretation, one can write $dy = f'(x) dx$ in some sense exactly. However, this statement really needs to be read as something like "the linear approximation to the change in $y$ is proportional to the (linear approximation to the) change in $x$", so this is really repeating what we said above; locally the function is approximately, but not exactly, linear.

Have fun!