Treat Differentials As Fractions


I have been reading similar questions on this website for a long time today, along with the answers provided about the issue I am asking now, but I have to say I am more baffled than I was originally.

My original question would be:

I have read in many textbooks, when they teach integration, that they emphasize not to treat $dx$ as a quantity but only as notation indicating the variable of integration. After a while, the same textbooks start talking about differentials, and based on the following expressions

$$ df = f'(x)dx\quad df = f_{x}dx+f_{y}dy $$

they say that, now that differentials have been introduced, we can treat them as independent variables and use them freely as fractions.

I was always working with that idea, but after some digging I started seeing a lot of different answers, especially here, and most of them were conflicting.

Some people mentioned Non-Standard Analysis and Differential Forms, trying to justify that we can treat differentials as fractions. Others said that every time we treat them as fractions we are actually applying other theorems, so we don't really treat them that way.

So the questions I have to ask now are:

  1. Can we actually treat differentials as fractions in EVERY framework, or is there today a debate about it?

    I don't want personal opinions or what we can do in a specific framework. I saw a lot of people saying that in single-variable calculus you may treat them like that because you won't make mistakes. Can you actually treat them like that, strictly mathematically, in EVERY framework, or can you only do it for convenience while actually applying different theorems behind the scenes?

  2. If differentials can't be treated as fractions in EVERY framework, then is it defined and proven that they can always be treated as fractions in some frameworks?

  3. Given that we have defined differentials as

    $$ df = f'(x)dx\quad df = f_{x}dx+f_{y}dy $$

    then didn't we automatically define that they can be used as independent quantities, and so can also be seen as fractions? Shouldn't we have accepted them as fractions from the moment that definition existed?
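To make the single-variable definition concrete, here is a minimal numeric sketch (not from the original post; the function $f(x) = x^2$ and the increment are my own choices): the definition $df = f'(x)\,dx$ says $df$ is the linear part of the change in $f$.

```python
# Numeric illustration of df = f'(x) dx as a linear approximation,
# using f(x) = x**2 (so f'(x) = 2x) at x0 = 3 with a small increment dx.
f = lambda x: x**2
fprime = lambda x: 2*x

x0, dx = 3.0, 1e-6
actual_change = f(x0 + dx) - f(x0)   # f(x0 + dx) - f(x0)
df = fprime(x0) * dx                 # the differential

# The two agree up to terms of order dx**2
print(abs(actual_change - df) < 1e-9)  # True
```

Note that this only shows $df$ approximates the change in $f$; it says nothing by itself about dividing by $dx$.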


There are 3 best solutions below

---

The reason the definition of a differential doesn't allow you to treat differentials as fractions is that you're not automatically allowed to divide by everything you see in an equation. You can't divide by a three-dimensional vector, for example, and this situation is just like that: differentials are vectors.

In nonstandard analysis, in a single variable, derivatives are the standard parts of quotients of infinitesimals. There's no disagreement about that. But the infinitesimals are not differentials.
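SymPy has no genuine infinitesimals, so as a rough sketch one can let the limit of the difference quotient play the role of taking the standard part (the choice of $f = \sin$ is mine, for illustration):

```python
import sympy as sp

x, eps = sp.symbols('x epsilon')
f = sp.sin  # any smooth function will do

# Difference quotient with a "small" increment eps; the limit eps -> 0
# stands in for the standard part of the quotient of infinitesimals.
quotient = (f(x + eps) - f(x)) / eps
derivative = sp.limit(quotient, eps, 0)
print(derivative)  # cos(x)
```

The quotient itself is a genuine fraction here; it is only after taking the standard part (here, the limit) that the derivative appears.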

While it is sometimes useful to treat derivatives as fractions, it can lead to error. There is a famous example, the triple product rule: for three variables $u, v, w$ related by a smooth constraint (with nonvanishing partial derivatives), $$\frac{\partial u} {\partial v} \frac{\partial v} {\partial w} \frac{\partial w} {\partial u} =-1$$ whereas naive cancellation of the "fractions" would give $+1$. This proves that you can't treat multivariable derivatives as fractions and expect to always get the right answer.
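The cyclic identity can be checked concretely. As a sketch (my choice of constraint, not from the original answer), take the ideal gas law $PV = T$ in units where $nR = 1$ and compute the cyclic product with SymPy:

```python
import sympy as sp

P, V, T = sp.symbols('P V T', positive=True)

# Ideal gas law with nR = 1: P*V = T. Solve for each variable in turn.
P_of = T / V   # P as a function of (V, T)
V_of = T / P   # V as a function of (P, T)
T_of = P * V   # T as a function of (P, V)

# Cyclic product (dP/dV)(dV/dT)(dT/dP), each with the third variable held fixed
product = sp.diff(P_of, V) * sp.diff(V_of, T) * sp.diff(T_of, P)

# Substitute the constraint T = P*V and simplify
result = sp.simplify(product.subs(T, P * V))
print(result)  # -1
```

Naively cancelling the $\partial$'s top and bottom would predict $+1$, so the fraction picture visibly fails here.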

---

This question comes up a lot, and the best answer when you are new to learning calculus is to simply regard $\frac{dy}{dx}$ as non-sense which gives you a correct answer. Think of it as notation, with its own rules, that, once you learn how to use it, will lead to the correct answer. Do not try to understand it.

I would suggest learning calculus both the "non-sense" way and the precise way. In the precise way, you avoid this notation and instead rely on the properties of the derivative. This way you will be able to convince yourself that the trick-in-differential-notation leads to the same answer you would get if you did it precisely.
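As an illustration (a sketch using SymPy; the integrand is my own example), the $u$-substitution trick that "cancels" $dx$ gives the same antiderivative as direct integration:

```python
import sympy as sp

x, u = sp.symbols('x u')

# Direct integration, no differential manipulation
direct = sp.integrate(2*x*sp.cos(x**2), x)

# "Fraction" way: set u = x**2, so du = 2x dx, and the integrand
# becomes cos(u) du; integrate in u, then substitute back
via_sub = sp.integrate(sp.cos(u), u).subs(u, x**2)

print(direct, via_sub)  # both sin(x**2)
```

The precise justification is the chain rule (the change-of-variables theorem), which is what the cancellation of $dx$ is secretly applying.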

---

Short(ish) answer to these particular questions:

  1. Can we actually treat differentials as fractions in EVERY framework, or is there today a debate about it?

No, you can't ALWAYS treat differentials as fractions. But there is also no debate about it today.

In some contexts - for example, differential geometry and nonstandard analysis - differentials (or infinitesimally small numbers) have a precise mathematical definition. In such contexts you can reason formally with $dx$ and $dy$ separately from $dy/dx$. That means the answer to your question (2) is "yes".

In many applications - particularly physics - thinking in terms of differentials is a very good way to turn intuition about a problem into mathematics.

You can see each of these responses in posts on this site.

Related: Why can't the second fundamental theorem of calculus be proved in just two lines?