How do I do a "regression" on a differential equation?


I have a differential equation for a model of uptake and depuration of a contaminant. The model looks like this: $$\frac{dC}{dt}=k_u-(k_d+g)\cdot C(t)$$ I have estimates of the depuration rate $k_d$ and the growth rate $g$, and I have the concentration at times $t=0$ and $t=6$. I would like to use those to estimate the uptake rate $k_u$ and, ideally, find a confidence interval for $k_u$.

So far my best bet is to run a simulation with an estimated $k_u$, see whether I end up at the right value, adjust it up or down, see whether that helps, and repeat until I get a decent best guess. I could probably automate this iterative process in MATLAB. It just feels a bit too home-made.
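For reference, here is a minimal sketch of that automation (in Python rather than MATLAB; the rate constants and concentrations are made-up placeholder values, not my actual estimates). It integrates the ODE for a trial $k_u$ and bisects on the simulated $C(6)$:

```python
# Sketch of the adjust-and-rerun loop, automated with bisection.
# k_d, g, C0, C6_obs below are hypothetical placeholder values.

k_d, g = 0.1, 0.05       # assumed depuration and growth rates
C0, C6_obs = 2.0, 8.0    # assumed concentrations at t = 0 and t = 6

def simulate_C6(k_u, steps=6000):
    """Integrate dC/dt = k_u - (k_d + g)*C from t = 0 to t = 6 (classic RK4)."""
    f = lambda C: k_u - (k_d + g) * C
    C, h = C0, 6.0 / steps
    for _ in range(steps):
        k1 = f(C)
        k2 = f(C + 0.5 * h * k1)
        k3 = f(C + 0.5 * h * k2)
        k4 = f(C + h * k3)
        C += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return C

# The simulated C(6) increases monotonically with k_u, so bisection converges.
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if simulate_C6(mid) < C6_obs:
        lo = mid
    else:
        hi = mid
k_u_hat = 0.5 * (lo + hi)
print(k_u_hat)
```

In MATLAB the same idea would be `ode45` inside `fzero`.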

Any other ideas?

(Note that in this case, a regression is not strictly necessary, as I already estimated the other parameters. It could just be the nonlinear equivalent of connecting two dots with a straight line.)
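To make that "connecting two dots" remark concrete: since the ODE is linear with constant coefficients, it has a closed-form solution. Writing $k=k_d+g$,
$$C(t)=C(0)\,e^{-kt}+\frac{k_u}{k}\left(1-e^{-kt}\right),$$
so plugging in $t=6$ and solving for $k_u$ gives
$$k_u=k\,\frac{C(6)-C(0)\,e^{-6k}}{1-e^{-6k}},$$
which is the exact two-point estimate I am after; the open part of the question is really the confidence interval.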