Sometimes in physics one sees deductions like this:
If $dq=f\left(x\right)\cdot dr$, then $\frac{dq}{dt}=f\left(x\right)\cdot \frac{dr}{dt}$, which is not a valid deduction as written. Is there any way to justify the required steps and make it rigorous, for example using the inverse function theorem or something similar?
Thanks a lot pals, Karan
I would say the first equation is an abuse of notation, while the second is well defined provided the variables have the right properties. You can make the step rigorous by expanding everything in Taylor series: the first equation is then a statement about the first-order terms, together with the claim that the second-order terms become negligible in the limit $\Delta t \to 0$, and the second equation takes advantage of that. Your functions have to be "nice" for all this to work: differentiable, of course, and probably also uniformly continuous or Lipschitz. Functions in physics tend to be nice, because derivatives that blow up would represent an infinite energy, force, or the like.
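To spell out the Taylor-series reading, here is a sketch under the assumption that $q$ and $r$ are differentiable functions of $t$ and that the differential relation means $\Delta q = f(x)\,\Delta r + o(\Delta r)$:

```latex
% Increment form of dq = f(x) dr (first-order Taylor expansion):
\begin{align*}
\Delta q &= f\bigl(x(t)\bigr)\,\Delta r + o(\Delta r) \\
\frac{\Delta q}{\Delta t}
         &= f\bigl(x(t)\bigr)\,\frac{\Delta r}{\Delta t}
            + \frac{o(\Delta r)}{\Delta t}
\end{align*}
% As \Delta t \to 0: \Delta r/\Delta t \to dr/dt is finite because r is
% differentiable, so \Delta r \to 0 and o(\Delta r)/\Delta t \to 0.
% Taking the limit yields  dq/dt = f(x) \, dr/dt.
```

So the physicists' manipulation is just the chain rule in disguise: dividing the first-order relation by $\Delta t$ and discarding the higher-order remainder in the limit.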