I understand that this notation now denotes a differential operator, and that $\frac {dy}{dx}$ is the limit of a difference quotient, but Leibniz regarded $\frac {dy}{dx}$ as a quotient. In Leibniz's theory, where $\frac {dy}{dx}$ is a quotient, do the terms in the chain rule cancel out?
For instance below, there are two instances of $du$. $$ y = f(u), u = u_1 = u_2 = g(x) $$ $$ \frac {dy}{du_2} \Big|_{g(x)} \cdot \frac {du_1}{dx} \Big|_{x} = (f \circ g)'(x) = (f' \circ g)(x) \cdot g'(x) $$
It's my understanding that, in general, $$du_1 = g(x+h) - g(x) \ne (g(x) + h) - g(x) = du_2,$$ but that under the infinitesimal theory Leibniz used this becomes $$\lim_{h \rightarrow 0}{\big(g(x+h) - g(x)\big)} = \lim_{h \rightarrow 0}{\big((g(x) + h) - g(x)\big)},$$ $$du_1 = du_2,$$
and the above terms $du_1, du_2$ cancel out, leaving $\frac {dy}{dx}$.
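To make the difference between $du_1$ and $du_2$ concrete for a finite $h$, here is a quick numerical check; the choice $g(x) = x^2$ and the values of $x$ and $h$ are my own illustration:

```python
# Compare the two candidate increments du1 and du2 for g(x) = x^2
# (an illustrative choice) at x = 1 with a small but finite h.
def g(x):
    return x * x

x, h = 1.0, 0.1

du1 = g(x + h) - g(x)    # increment of u along the curve: ~0.21
du2 = (g(x) + h) - g(x)  # increment of u by h directly:   ~0.1

print(du1, du2)          # unequal for any finite h
```

For finite $h$ the two increments differ ($\approx 0.21$ vs. $\approx 0.1$ here); the question is what happens when $h$ is infinitesimal.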
If these terms are unequal and do not cancel, why is $du_1 = du_2 = du$ used in the statement of the chain rule in modern theory? Are there advantages to viewing things this way, from an infinitesimal theory like Leibniz's?
There is no equality under the widely used theory of limits, because there $\frac {dy}{dx}$ is the limit of a quotient; with Leibniz's theory of infinitesimals, however, $\frac {dy}{dx}$ is a quotient. Leibniz's infinitesimal theory has logical contradictions, and to resolve them the theory of non-standard analysis on the hyperreal numbers can be used. The hyperreals $\mathbb{R}^*$ are an extension of the reals that contains infinitesimals, $\mathbb{R} \subseteq \mathbb{R}^*$. The derivative and integral of calculus are expressed with the standard part function $\operatorname{st} : \mathbb{R}^* \rightarrow \mathbb{R}$.
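As a sketch of how this works (writing $\operatorname{st}$ for the standard part function, which rounds a finite hyperreal to its nearest real), the derivative in non-standard analysis is the standard part of a genuine quotient: $$ f'(x) = \operatorname{st}\!\left(\frac{f(x+\varepsilon) - f(x)}{\varepsilon}\right), \qquad \varepsilon \in \mathbb{R}^* \text{ infinitesimal}, \ \varepsilon \neq 0. $$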
With non-standard analysis the above equality $du_1 = du_2$, i.e. $g(x+h) - g(x) = (g(x) + h) - g(x)$, holds if it can be proven; for instance, $g(x)$ is continuous because it is differentiable. Once proven, $du$ cancels out, and cancelling the equal terms gives the following result:
$$y = f(u), \quad u = g(x)$$ $$ \frac {dy}{du} \frac {du}{dx} = \frac {f(g(x+h)) - f(g(x))}{g(x+h)-g(x)} \cdot \frac {g(x+h) - g(x)}{h} = \frac {f(g(x+h)) - f(g(x))}{h} $$
Due to continuity, $(g(x) + h) - g(x) = g(x+h) - g(x)$, and then the result is: $$ \frac {f(g(x) + h) - f(g(x))} {h} $$
This appears to agree with the result using limits: $$ f'(g(x)) = \lim_{h\rightarrow 0}{\frac {f(g(x) + h) - f(g(x))} {h}} $$
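One way to see "the derivative as a literal quotient of cancelling increments" in executable form is dual-number arithmetic: numbers $a + b\varepsilon$ with $\varepsilon^2 = 0$. This is a much simpler ring than the hyperreals of non-standard analysis, so treat it only as an analogy; the `Dual` class and the polynomial choices $f(u) = u^3$, $g(x) = x^2 + 1$ below are my own illustration:

```python
class Dual:
    """Number a + b*eps with eps^2 = 0; b tracks the infinitesimal part."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

    __rmul__ = __mul__


def f(u):            # y = f(u) = u^3
    return u * u * u

def g(x):            # u = g(x) = x^2 + 1
    return x * x + 1

x = 2.0
y = f(g(Dual(x, 1.0)))   # seed the infinitesimal part of x with 1, i.e. x + eps

# Chain rule by hand: f'(u) = 3u^2, g'(x) = 2x,
# so (f o g)'(2) = 3 * 5^2 * 2 * 2 = 300.
print(y.b)               # infinitesimal coefficient, playing the role of dy/dx
```

Here `y.b` comes out as the chain-rule product $f'(g(x)) \cdot g'(x)$ automatically: the intermediate $du$ increments cancel inside the multiplication rule, much as in the quotient manipulation above.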