After learning the limit definition of the derivative, we still treat $\frac{dy}{dx}$ as a fraction. For example, to evaluate $\int \sin(x)\cos(x)\,dx$, we substitute $y = \sin(x)$, so that $\frac{dy}{dx} = \cos(x)$, and then we cancel $dx$ with $dx$ as if $\frac{dy}{dx}$ were a fraction.
In fact, once the derivative is defined as $\frac{df}{dx} = f'(x) = \lim\limits_{h \to 0} \frac{f(x+h)-f(x)}{h}$, it is no longer a fraction, yet we still treat it as one, as in the example above.
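To make the limit definition concrete, here is a small numerical sketch (my own illustration, taking $f(x) = \sin x$ so the exact derivative is $\cos x$): the difference quotient approaches the derivative as $h$ shrinks.

```python
import math

def difference_quotient(f, x, h):
    """Evaluate (f(x+h) - f(x)) / h, the quotient whose limit defines f'(x)."""
    return (f(x + h) - f(x)) / h

x = 1.0
exact = math.cos(x)  # the true derivative of sin at x
for h in (1e-1, 1e-3, 1e-5):
    approx = difference_quotient(math.sin, x, h)
    print(f"h={h:g}  quotient={approx:.8f}  error={abs(approx - exact):.2e}")
```

The error shrinks roughly in proportion to $h$, which is exactly what the limit definition promises.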
So how can both be correct? How do we reconcile treating $\frac{dy}{dx}$ as a fraction with the fact that, by the limit definition, it is not actually one?
It's not a fraction. What we have here is an instance of suggestive notation!
The symbol you're using looks like a fraction, and if you treat it like one, the naïve manipulations turn out to match genuine results. To me, this is a sign that the notation is well chosen.
What actually happens is an application of the chain rule together with the fundamental theorem of calculus. Since $\frac{d}{dx} f(g(x))=\frac{d}{du}f\big|_{u=g(x)}\cdot\frac{d}{dx} g(x)$ for differentiable functions $f$ and $g$, you can check that $\int_a^b f(g(x))\,g'(x)\,\textrm{d}x=\int_{g(a)}^{g(b)} f(u)\,\textrm{d}u$. Note that the placeholder variables $x$ and $u$ don't carry inherent meaning.
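Applied to the integral from the question, with $g(x) = \sin x$ (so $g'(x) = \cos x$) and $f(u) = u$, the theorem gives, for instance on $[0, \pi/2]$:

```latex
\int_0^{\pi/2} \sin(x)\cos(x)\,\textrm{d}x
  = \int_{g(0)}^{g(\pi/2)} u\,\textrm{d}u
  = \int_0^1 u\,\textrm{d}u
  = \frac{1}{2}
```

No cancellation of $dx$ was ever invoked; the rigorous theorem produces the same answer the fraction heuristic suggests.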
It just happens that this genuine mathematical result coincides with what you would get by naïvely manipulating the fractions.
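You can check that consistency numerically. The sketch below (my own, using a plain midpoint Riemann sum rather than any particular library) evaluates both sides of the substitution identity, $\int_0^{\pi/2}\sin(x)\cos(x)\,dx$ and $\int_0^1 u\,du$, and both land on $\frac{1}{2}$.

```python
import math

def midpoint_riemann(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] using n midpoint rectangles."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

# Left side: the original integral in x.
lhs = midpoint_riemann(lambda x: math.sin(x) * math.cos(x), 0.0, math.pi / 2)
# Right side: the integral after substituting u = sin(x).
rhs = midpoint_riemann(lambda u: u, 0.0, 1.0)
print(lhs, rhs)  # both should be very close to 0.5
```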