How to identify a linear non-homogeneous ODE from data?


I want to fit a model $$\dot{x} = Ax + v(t)Bx + Cu(t)$$

to data, where $u(t),v(t)$ are known inputs and $A,B,C$ are to be fitted. The data are assumed to be drawn from the above model with i.i.d. additive noise, i.e. the observations $y_t$ satisfy $$y_t = x(t) + \eta, \qquad \eta \sim \mathcal{N}(0,\sigma^2).$$

Now I want to use linear regression to fit the model to these data. Is it an appropriate approach to use $$ \dot{y}_t \approx \frac{y_{t+1}-y_{t}}{\delta t} =: z_t$$

and then use linear regression for $$z_t | y_t = Ay_t + v_tBy_t + Cu_t + \eta_z$$

with $\eta_z \sim \mathcal{N}(0, \frac{2 \sigma^2}{(\delta t)^2})$?
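For concreteness, a minimal sketch of this finite-difference regression in Python. The state dimension, the scalar inputs, and the function name `fit_abc` are my own illustrative choices, not from the question:

```python
import numpy as np

def fit_abc(y, u, v, dt):
    """Least-squares fit of z_t = (y_{t+1}-y_t)/dt on [y_t, v_t*y_t, u_t].

    y : (T, n) noisy state observations at uniform spacing dt
    u, v : (T,) known scalar inputs
    Returns estimates of A (n,n), B (n,n), C (n,1).
    """
    z = (y[1:] - y[:-1]) / dt       # forward-difference derivative, (T-1, n)
    Y = y[:-1]                      # states at the left endpoints
    V = v[:-1, None] * Y            # v_t * y_t term
    U = u[:-1, None]                # scalar input u_t
    X = np.hstack([Y, V, U])        # regressor matrix, (T-1, 2n+1)
    theta, *_ = np.linalg.lstsq(X, z, rcond=None)
    n = y.shape[1]
    return theta[:n].T, theta[n:2*n].T, theta[2*n:].T   # A, B, C
```

With noiseless data the regression recovers $A,B,C$ essentially exactly; with noise, $z_t$ carries the inflated variance $\frac{2\sigma^2}{(\delta t)^2}$ noted above, so shrinking $\delta t$ amplifies the noise in the regression targets.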

Or should I avoid this approximation? (In the simple 1-d case with no input one could, e.g., fit an equation of the form $\ln y_t = at + \ln x_0$, i.e. use the explicit solution, but I don't see how to generalize this.)

Any help would be appreciated; I would also be glad for a link where exactly this special case is treated. (Most resources I find are detailed treatments of system identification, which seems like too much machinery for this simple case.)

1 Answer

Since (in the scalar case, or when the matrices $A+Bv(t)$ at different times commute) the solution of the ODE can be written as

$$ x(t) = \exp \left(\int _0^t(A+B v(s))\,ds \right)\int _0^t\exp \left(-\int _0^{\mu }(A+B v(s))\,ds \right) C\, u(\mu )\,d\mu + c_0 \exp \left(\int _0^t(A+B v(s))\,ds \right), $$

given the data points $\{(t_k,x_k) : k=1,\cdots,n\}$ we can form the error function

$$ \sum_k\|x_k-x(t_k)\|^2 = \mathcal{E}(A,B,C,c_0) $$

and then proceed with a numerical minimization method to find the minimum of $\mathcal{E}(A,B,C,c_0)$.
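This can be sketched by integrating the ODE numerically instead of evaluating the closed form (which, in the matrix case, is only exact when $A+Bv(t)$ commutes across times). The residual parameterization and the use of `scipy.optimize.least_squares` are my own assumptions, not prescribed by the answer:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def residuals(params, t_data, x_data, u, v, n):
    """Stack x_k - x(t_k) for the error function E(A, B, C, c0)."""
    A = params[:n*n].reshape(n, n)
    B = params[n*n:2*n*n].reshape(n, n)
    C = params[2*n*n:2*n*n+n].reshape(n, 1)   # scalar input u(t)
    c0 = params[2*n*n+n:]                      # initial condition
    rhs = lambda t, x: (A + v(t)*B) @ x + (C @ np.atleast_1d(u(t)))
    sol = solve_ivp(rhs, (t_data[0], t_data[-1]), c0,
                    t_eval=t_data, rtol=1e-8, atol=1e-10)
    return (sol.y.T - x_data).ravel()

# Minimize E by nonlinear least squares from an initial guess theta0, e.g.:
# fit = least_squares(residuals, theta0, args=(t_data, x_data, np.sin, np.cos, n))
```

A reasonable initial guess `theta0` is the finite-difference regression estimate from the question, refined here by matching the full trajectory.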