What's the importance of "infinitesimally small" whenever calculus is explained


First of all, I just enjoy reading about and trying to understand things related to math, but I am not at all an expert. So I apologize if the question seems dumb.

It always puzzles me that whenever basic calculus is explained, it makes use of phrases like "infinitesimally small" or "very, very small". Why is it so important to use this term in calculus?

Similarly with "limits". The answer always seems blurry, as it ends up with notions like "the answer is such-and-such as we go from point $a\to b$".

Why can't there be a "clear and perfect" answer when doing problems in calculus? As I said, why must something unclear like "infinity", or symbols like "$x \to 2$" (limits), be used in calculus for solving problems?

4 Answers

Best answer (2 votes)

I see calculus as being built from these "unclear" concepts, rather than forcing the use of them. The limit is the first and arguably most important concept in calculus, and the rest of the field is built on limits. Limits can be hard to completely understand, partly because they often involve very small ("infinitesimal") and very large ("approaching infinity") numbers.

My mental picture for limits, infinitesimals, and infinity is based on two people having a conversation about numbers. When something "increases without bound", "goes to infinity" or (poorly worded) "equals infinity", I imagine that anytime one of my imaginary people says a number, the other one says a larger number, then the first says an even larger number and so on. They will never stop going back and forth, and that is how I see it. I see infinitesimals analogously, with smaller and smaller numbers.

For limits, remember that there is a formal definition for a limit, which usually includes something like "for all $\epsilon$ there exists a $\delta$ such that," (now paraphrasing) when $x$ gets really close to $a$, $f(x)$ gets really close to some number $L$. How close is "really close"? That's what $\epsilon$ and $\delta$ are for. Now we go back to my two people. Here's the conversation:

A: This gets really close to $L$.

B: How close?

A: How close do you want it? Pick any number greater than zero.

B: Ok, how about this number $\epsilon$ I happen to have in my pocket?

A: That works. Every time you pull any $\epsilon$ out of your pocket, I will give you a $\delta$. You take my $\delta$ and push $x$ until it is less than $\delta$ far away from $a$. Once you've done that, I guarantee that $f(x)$ will be closer than $\epsilon$ distance away from $L$.

So person B keeps pulling out smaller and smaller epsilons and finding no matter how small they are, person A can always give back a $\delta$ that works.

In other words, no matter how close person B wants $f(x)$ and $L$ to get, person A can always guarantee the closeness, as long as person B is willing to put $x$ a certain distance closer to $a$. Their conversation continues forever, just like them going back and forth with larger and larger numbers to approach infinity. Therefore, person A has "proven" the closeness of $f(x)$ and $L$ to person B. Since you and I can't talk forever, I let my two imaginary friends do the talking for us and just skip to the end of their conversation.
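The conversation above can be sketched in code. This is just an illustrative example of my own choosing, not anything from a textbook: take $f(x) = 2x$ near $a = 1$, where the limit is $L = 2$, and give person A the winning strategy $\delta = \epsilon/2$.

```python
# Epsilon-delta "conversation" for the illustrative example f(x) = 2x
# near a = 1, where the limit is L = 2. Person A's strategy: delta = epsilon/2.

def f(x):
    return 2 * x

def choose_delta(epsilon):
    # Person A's response to any epsilon person B pulls out of their pocket.
    return epsilon / 2

def a_wins(epsilon, a=1.0, L=2.0):
    # Check A's guarantee on a sample of x values with |x - a| < delta:
    # for every such x, |f(x) - L| must be smaller than epsilon.
    delta = choose_delta(epsilon)
    xs = [a + delta * t / 1000 for t in range(-999, 1000)]
    return all(abs(f(x) - L) < epsilon for x in xs)

# Person B tries smaller and smaller epsilons; A wins every round.
for eps in [0.1, 0.01, 0.001]:
    print(eps, a_wins(eps))  # A's delta works every time
```

Of course, a finite sample of $x$ values is not a proof; the actual proof is the one-line algebra fact that $|2x - 2| = 2|x - 1| < 2\delta = \epsilon$, which holds for *all* $x$ within $\delta$ of $1$, no matter how small an $\epsilon$ person B names.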

Calculus is the part of math where we leave behind ideas that have one, simple answer. We are not asking "what is $513+2138756$?" We are asking more general questions like "if this goes on forever, where will it end up?" Once we start going to those imaginary places like "forever", we have to get some new concepts that are not as simple as the old ones.

Does that help at all?

Answer (2 votes)

I think this old but very entertaining book will help. The book is also freely available on the internet.

Silvanus Phillips Thompson (1851-1916), Calculus Made Easy, 2nd edition, MacMillan and Company, 1914.

Answer (0 votes)

In my opinion those symbols are clear and unambiguous. The problem is you probably weren't given a definition of what they mean, so you give them an intuitive meaning, and that's why you find them blurry. The same can be said about sentences like "very, very small" or "when $n$ is big". Those sentences have a clear and well-defined mathematical meaning.

For instance, in the context of sequences, one often hears that "$\displaystyle \frac{1}{n}$ is essentially $0$ when $n$ is big enough". The mathematical meaning of that sentence is $\displaystyle \left(\forall \epsilon \in \mathbb{R}^+\right) \left(\exists p\in \mathbb{N}\right) \left(\forall n\in\mathbb{N}\right) \left(n\ge p \Longrightarrow \left\vert \frac{1}{n}-0\right\vert <\epsilon \right)$, and it is often abbreviated by $\displaystyle \frac{1}{n} \to 0$, or by "$\displaystyle \frac{1}{n}$ converges to $0$". There's nothing unclear about this.
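To make the quantifiers concrete, here is a small sketch: for any $\epsilon > 0$ an explicit witness is $p = \lfloor 1/\epsilon \rfloor + 1$, since then $1/p < \epsilon$ and $1/n$ only shrinks as $n$ grows. (The function name `find_p` is just mine for illustration.)

```python
import math

def find_p(epsilon):
    # A witness for the "there exists p" in the definition above:
    # p = floor(1/epsilon) + 1 guarantees 1/n < epsilon for all n >= p.
    return math.floor(1 / epsilon) + 1

for eps in [0.5, 0.01, 0.0001]:
    p = find_p(eps)
    # 1/p is already below epsilon, and 1/n decreases beyond p.
    print(eps, p, 1 / p < eps)
```

The point is that "big enough" is not vague at all: the definition hands you a recipe that turns every challenge $\epsilon$ into a concrete index $p$.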

Answer (0 votes)

I'll narrow the scope of my answer to just be about "Differential Calculus". The proper objects of study in differential calculus are the derivatives of functions. Intuitively, we can say that differential calculus is the study of the "local" behavior of a function.

How can we define the concept of "local"? Clearly, we want to look at points close to the point of interest. But to make this definition precise, we must use some kind of limit. Using a fixed-size neighborhood just will not cut it, because we can zoom in arbitrarily and make any fixed-size neighborhood arbitrarily large.

So in the end, to discuss anything related to differentiation, we must use the concept of a limit in order to get a local picture of a function. To get a precise understanding of limits requires some technical mathematics, but I'm afraid that's the only way to build a solid foundation for calculus.

This concept of "local analysis" is part of a broad theme in Mathematics: to achieve a global understanding of an "object" by stitching together all the local pictures.