What's so different about limits compared to infinitesimals?


If you find that the limit of a given function is 2, wouldn't that be the same as saying its value is $2 + \epsilon$, with $\epsilon$ a negligible (infinitesimal) quantity? That way of defining limit-like behavior seems rigorous enough, yet it took until Abraham Robinson in the 1960s to lay a proper foundation for nonstandard analysis. My question is: what really is the difference?
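For concreteness, here are the two definitions being contrasted, written side by side. These are the standard textbook formulations (not taken from the post itself); $\operatorname{st}$ denotes the standard-part function and $x \approx a$ means $x - a$ is infinitesimal.

```latex
% Standard (Weierstrass) epsilon-delta definition:
\lim_{x \to a} f(x) = L
\iff
\forall \varepsilon > 0 \;\exists \delta > 0 :\;
0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.

% Nonstandard (Robinson) definition:
\lim_{x \to a} f(x) = L
\iff
\operatorname{st}\bigl(f(x)\bigr) = L
\quad \text{for every } x \approx a,\; x \neq a.
```

The nonstandard version replaces the quantifier alternation over $\varepsilon$ and $\delta$ with a single condition on infinitesimally close points, which is where its appeal (and the need for a rigorous construction of the infinitesimals themselves) comes from.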

Best answer:

There were various theories extending the real numbers to include infinitesimals during the period from 1870 to 1960, but Robinson was the first to introduce a system robust enough to be used in analysis. The earlier systems were criticized by Klein, Fraenkel, and others on the grounds that they were never proven to satisfy, for example, the mean value theorem over infinitesimal intervals. Robinson's framework satisfies this and more.