If slope information is available in addition to the observations, can it be used to improve the fit? Consider the following example:
Suppose I have observations $\mathbf{x}$ and corresponding $\mathbf{y}$, and I need to fit a model of the form $y = a \sin(bx)+c$ and estimate its parameters, where $a$, $b$, $c$ are the model parameters.
Now suppose I also know the derivative values very precisely at points $\mathbf{x'}$, where $\mathbf{x'}$ may or may not be equal to $\mathbf{x}$.
Can I improve my parameter estimates using the derivative information? (My intuition says YES.) If yes, how?



This is a very interesting question about regression where you also have accurate values for the derivatives at some points.
Following JJacquelin's answer, suppose that you have $n$ data points $(x_i,y_i)$ and $m$ data points $(x_j,y'_j)$. Consider the models $$y=a+b\sin(\omega x)+ c\cos(\omega x)$$ $$y'=b\omega\cos(\omega x)- c\omega\sin(\omega x)$$ and the sums of squared errors $$S_1=\sum_{i=1}^n\left(a+b\sin(\omega x_i)+ c\cos(\omega x_i)-y_i\right)^2$$ $$S_2=\sum_{j=1}^m\left(b\omega\cos(\omega x_j)- c\omega\sin(\omega x_j)-y'_j\right)^2$$ Then what you can do is minimize, with respect to the parameters $(a,b,c,\omega)$, the global function $$\Phi=S_1+k S_2$$ where $k$ is a weighting factor you assign to the derivative data (it would express, for example, that the derivatives are ten times more accurate than the function values).
Since JJacquelin's simple procedure already gives more than reasonable estimates for the parameters, you could start from these with a general-purpose optimizer, or even use the Newton-Raphson method, to get the final values.
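To illustrate, here is a minimal sketch of minimizing $\Phi=S_1+kS_2$ with `scipy.optimize.least_squares`. The data are synthetic (the true parameter values and noise levels are my own assumptions, chosen only for the demonstration), and the initial guess stands in for the estimates JJacquelin's procedure would supply. The weight $k$ enters through $\sqrt{k}$ on the derivative residuals, since the solver squares and sums them.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical synthetic data for y = a + b*sin(w*x) + c*cos(w*x)
rng = np.random.default_rng(0)
a_true, b_true, c_true, w_true = 1.0, 2.0, -0.5, 1.3

x = np.linspace(0, 10, 30)   # n points with noisy function values
y = (a_true + b_true * np.sin(w_true * x) + c_true * np.cos(w_true * x)
     + 0.1 * rng.standard_normal(x.size))

xp = np.linspace(0, 10, 15)  # m points with (more accurate) derivative values
yp = (b_true * w_true * np.cos(w_true * xp) - c_true * w_true * np.sin(w_true * xp)
      + 0.01 * rng.standard_normal(xp.size))

k = 10.0  # weighting factor: derivatives assumed ~10x more accurate

def residuals(p):
    a, b, c, w = p
    r1 = a + b * np.sin(w * x) + c * np.cos(w * x) - y       # terms of S1
    r2 = b * w * np.cos(w * xp) - c * w * np.sin(w * xp) - yp  # terms of S2
    # sum of squares of this vector equals Phi = S1 + k*S2
    return np.concatenate([r1, np.sqrt(k) * r2])

# Initial guess plays the role of JJacquelin's preliminary estimates
fit = least_squares(residuals, x0=[0.8, 1.8, -0.3, 1.25])
a, b, c, w = fit.x
```

Note that a reasonably close starting value for $\omega$ matters: the objective is highly non-convex in the frequency, which is exactly why a preliminary estimation procedure is valuable.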