I always struggle with infinitesimals, and I'm not sure I'm getting this right. The title basically states the simplest version of my question: If a function has zero slope at some point, is it correct to say that $f(x)=f(x+dx)$ for an infinitesimally small $dx$? -- If so, can anyone intuitively explain why I can drop the other terms of the Taylor expansion here?
Edit: It has been pointed out that "infinitesimal" really isn't a well-defined concept in the way that I've used it here, and the answer by Seth explains why. I've clarified what I'm after (without the use of "infinitesimals") in a separate question, to avoid mixing too many different problems under one header.
Using the hyperreal number system defined by Robinson, which is a common context in which infinitesimals can be rigorously defined, a function $f$ is differentiable at $x$ if there is a real number $f'(x)$ such that $$f'(x)\approx\frac{f(x+\epsilon)-f(x)}{\epsilon}$$ for all nonzero infinitesimals $\epsilon$ (the quotient requires $\epsilon\neq 0$). Here, $a\approx b$ means that $a$ and $b$ differ by an infinitesimal.
Thus, if $f$ is differentiable at $x$, this implies that $$\frac{f(x +dx)-f(x)}{dx}=f'(x)+\eta$$ for some infinitesimal $\eta$, which implies that $$f(x+dx)-f(x)=dx\,(f'(x)+\eta).$$ Since $f'(x)+\eta$ is finite and $dx$ is infinitesimal, the right side is infinitesimal, and therefore, in the notation above, $f(x+dx)\approx f(x)$.
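A finite analogue of this computation can be sketched with dual numbers, where a formal symbol $\epsilon$ satisfies $\epsilon^2=0$, mimicking how higher-order infinitesimal terms drop out of the Taylor expansion. The `Dual` class below is my own minimal illustration, not part of any standard library:

```python
class Dual:
    """Number of the form a + b*eps, where eps**2 == 0 (a 'truncated infinitesimal')."""

    def __init__(self, a, b=0.0):
        self.a = a  # standard (real) part
        self.b = b  # coefficient of eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + a2*b1)*eps, since eps**2 = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

    __rmul__ = __mul__


def f(x):
    return x * x + 3 * x  # f'(x) = 2x + 3


dx = Dual(0.0, 1.0)  # the "infinitesimal" increment eps
y = f(2.0 + dx)      # carries both f(2) and f'(2)
print(y.a, y.b)      # prints 10.0 7.0
```

The standard part `y.a` equals $f(2)$, i.e. $f(x+dx)\approx f(x)$, while the coefficient of $\epsilon$ recovers $f'(2)=7$: exactly the statement $f(x+dx)=f(x)+dx\,(f'(x)+\eta)$ with the infinitesimal error suppressed.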
Interestingly, this is exactly the nonstandard definition of continuity: $f$ is continuous at $x$ if $f(x+\epsilon)\approx f(x)$ for all infinitesimals $\epsilon$. Thus the above argument shows that differentiability implies continuity.
The hypothesis $f'(x)=0$ was not used above. This stronger hypothesis means that $$\frac{f(x+dx)-f(x)}{dx}\approx 0,$$ which means intuitively that $f(x+dx)-f(x)$ is so small that even after dividing it by the infinitesimal $dx$, it is still infinitesimal.
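To see this concretely, take a function of my own choosing with zero slope, $g(x)=(x-1)^2$ at $x=1$: the increment is $g(1+\epsilon)-g(1)=\epsilon^2$, so the difference quotient is $\epsilon$ itself, which is still infinitesimal. A numerical sketch with shrinking real increments standing in for $\epsilon$:

```python
def g(x):
    return (x - 1.0) ** 2  # g'(1) = 0


for eps in (1e-2, 1e-4, 1e-6):
    quotient = (g(1.0 + eps) - g(1.0)) / eps
    # the quotient equals eps (up to rounding): even after dividing the
    # increment by the tiny eps, the result is still tiny
    print(eps, quotient)
```

Each printed quotient is (up to floating-point rounding) equal to `eps`, mirroring the nonstandard statement that $\frac{g(1+\epsilon)-g(1)}{\epsilon}\approx 0$.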