Is there a way to justify these manipulations in Leibniz notation without resorting to nonstandard analysis?
E.g. $\frac{dy}{dx} = x \\ dy = x\,dx\\ \int dy = \int x\,dx\\ y = \frac{1}{2} x^2 + C$
Since the OP included the tag nonstandard-analysis with the original question, it may be fair to point out that it is not quite accurate to say that the expression $\frac{dy}{dx}$ is just "notation". Indeed, Leibniz's viewpoint was that $\frac{dy}{dx}$ is a ratio of infinitesimal differentials. Similarly, in the hyperreal framework it can be viewed as a ratio of infinitesimals, as explained in Keisler's textbook Elementary Calculus.
As far as the equation $\frac{dy}{dx}=x$ is concerned, this machinery is not really necessary, since a single integration does the trick; but for $\frac{dy}{dx}=y$, for example, we get a nontrivial instance of separation of variables, where a hyperreal framework can be useful in justifying the procedures usually followed in this approach.
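To make this concrete, here is a sketch of the usual formal manipulation for $\frac{dy}{dx}=y$ (assuming $y>0$ so the logarithm is defined), which is exactly the kind of step-by-step use of differentials that such a framework aims to justify:

$$\frac{dy}{y} = dx, \qquad \int \frac{dy}{y} = \int dx, \qquad \ln y = x + C, \qquad y = e^{C} e^{x}.$$

Each intermediate line treats $dy$ and $dx$ as genuine (infinitesimal) quantities being divided and summed, rather than as a purely symbolic device.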