Can such definition of infinitesimals hold?
$$\mathrm{d} x :=a:(a>0 \;\And\; \forall b \in \mathbb{R}^+\backslash \{ a \}\;(a<b))$$
And, if the above definition works, then obviously
$$\mathrm{d} f(x) := f(x+{\mathrm{d} x})-f(x)$$
This definition is basically an attempt to avoid delving into the realm of non-standard analysis and other sophisticated branches of mathematics, which seem excessive to me for defining such an intuitive concept.
The original problem with the infinitesimal approach to analysis/calculus was the existence of these objects. If $a = \mathrm{d}x$ were a real number, then so would $\frac{a}{2}$ be. Since $a > 0$, basic arithmetic gives $0 < \frac{a}{2} < a$, which means that $a$ cannot have the defining property of being an infinitesimal!
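To make the contradiction explicit in the notation of the question: suppose $a$ satisfies the proposed defining condition. Then
$$a > 0 \;\Longrightarrow\; \frac{a}{2} \in \mathbb{R}^+\backslash\{a\} \quad\text{and}\quad \frac{a}{2} < a,$$
which directly contradicts the clause $\forall b \in \mathbb{R}^+\backslash\{a\}\;(a<b)$. So no real number $a$ can satisfy the definition, and $\mathrm{d}x$ as defined denotes nothing at all.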
What the sophisticated machinery of Abraham Robinson did was establish a logically consistent and rigorous footing for the intuitive notion of an infinitesimal. Once this was done, one could do non-standard analysis/calculus without worrying at all about the inner details. (Much like most mathematicians think of the real number $\pi$ as a single discrete object, when it is actually an infinite set of rational numbers! Or maybe it is an equivalence class of Cauchy sequences of rational numbers. I forget.)