I have two functions that fit a single dataset, namely $f_1(x)=ax^2+bx+c$ and $f_2(x)=10^{ax^2+bx+c}$, whose gradients I then evaluate at a specific point $x_0$. The two gradients differ by at most 1 per cent.
Would it therefore be correct to assume that any two well-optimized fitting functions (i.e. both with a high correlation coefficient) give the same gradient at a given point?
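For reference, here is a minimal sketch of the comparison being described, using a hypothetical dataset and an arbitrary evaluation point $x_0$ (the question does not specify either). Both models are fitted with `scipy.optimize.curve_fit`, and the analytic derivatives $f_1'(x)=2ax+b$ and $f_2'(x)=\ln(10)\,(2ax+b)\,10^{ax^2+bx+c}$ are compared:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dataset: noisy samples from an underlying quadratic trend.
rng = np.random.default_rng(0)
x = np.linspace(0.5, 3.0, 40)
y = 0.3 * x**2 + 1.2 * x + 0.5 + rng.normal(0.0, 0.05, x.size)

def f1(x, a, b, c):
    # Quadratic model: f1(x) = a x^2 + b x + c
    return a * x**2 + b * x + c

def f2(x, a, b, c):
    # Exponential model: f2(x) = 10^(a x^2 + b x + c)
    return 10.0 ** (a * x**2 + b * x + c)

def grad_f1(x, a, b, c):
    # d/dx [a x^2 + b x + c] = 2 a x + b
    return 2.0 * a * x + b

def grad_f2(x, a, b, c):
    # d/dx [10^g(x)] = ln(10) * g'(x) * 10^g(x)
    return np.log(10.0) * (2.0 * a * x + b) * f2(x, a, b, c)

# Fit each model to the same data; f2 needs a sensible starting guess.
p1, _ = curve_fit(f1, x, y)
p2, _ = curve_fit(f2, x, y, p0=[0.01, 0.1, 0.1])

# Compare the gradients at an arbitrary interior point x0.
x0 = 1.5
g1 = grad_f1(x0, *p1)
g2 = grad_f2(x0, *p2)
print(f"f1'(x0) = {g1:.4f}, f2'(x0) = {g2:.4f}, "
      f"relative difference = {abs(g1 - g2) / abs(g1):.2%}")
```

Note that agreement of the gradients at one $x_0$ is an empirical observation about this particular dataset and point, not a property guaranteed by a high correlation coefficient alone.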