Separation of timescales - what happens to the slow variable?


Consider an ODE

$$ \frac{d}{dt} x_t = \frac{1}{\tau_x}f(x_t,y_t,I_t)$$ $$ \frac{d}{dt} y_t = \frac{1}{\tau_y}g(x_t,y_t,I_t)$$

where $I_t$ denotes the input to the system, and we assume for the time constants that $\tau_y \gg \tau_x$, i.e. $x$ changes much faster than $y$. Suppose we start at an equilibrium $(x_0, y_0)$ for $I_0 = 0$, and then change the input, where we suppose that the input is approximately constant on small time intervals.

It is a common assumption in applied settings that one can then separate the timescales: since $x$ changes much faster, one can study the reduced dynamics $$ \frac{d}{dt} \tilde{x}_t = \frac{1}{\tau_x}f(\tilde{x}_t,y_0,\tilde{I}_t),$$ with $y$ frozen at $y_0$ and constant input $\tilde{I}_t=c$.
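To make the frozen-$y$ reduction concrete, here is a minimal numerical sketch. The choices $f(x,y,I) = -x + y + I$, $g(x,y,I) = -y + x$, $\tau_x = 0.01$, $\tau_y = 1$, and the step input $I_t = 1$ are all hypothetical, picked only so that the fast subsystem has an explicit fixed point $x^* = y_0 + c$ and an explicit relaxation $x(t) = x^*(1 - e^{-t/\tau_x})$ to compare against:

```python
import math

# Hypothetical linear fast-slow system with tau_y >> tau_x.
tau_x, tau_y = 0.01, 1.0
f = lambda x, y, I: -x + y + I
g = lambda x, y, I: -y + x

# Start at the equilibrium (x0, y0) = (0, 0) for I0 = 0, then switch on I = 1.
x, y, I = 0.0, 0.0, 1.0

# Forward Euler over 5 * tau_x: long for the fast variable, short for the slow one.
dt, T = 1e-4, 0.05
for _ in range(int(T / dt)):
    x, y = x + dt / tau_x * f(x, y, I), y + dt / tau_y * g(x, y, I)

# Frozen-y prediction: dx/dt = (1/tau_x)(-x + y0 + I) relaxes to x* = y0 + I = 1.
x_frozen = 1.0 - math.exp(-T / tau_x)

print(x, x_frozen, y)  # x is close to the frozen-y prediction; y has barely moved
```

On this short window the full trajectory of $x$ stays close to the frozen-$y$ prediction, while $y$ drifts only by $O(T/\tau_y)$, which is the usual informal justification for the reduction.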

Now I wonder what the equivalent/dual of this heuristic is for the slow variable $y$.

That is, if we now consider a somewhat longer time interval (not so short that only the fast variable changes, but not very long either), how will $y$ change? Intuitively, I would expect that the slow dynamics can be approximated by some kind of averaging.

That is, if we denote by $\tilde{x}_s^{*}(y_0)$ the fixed point of the $\tilde{x}$ dynamics for frozen $y_0$ and input $\tilde{I}_s$, then I would expect that, at least in terms of the equilibrium, one could approximate

$$ \frac{d}{dt} \tilde{y}_t = \frac{1}{\tau_y}\cdot\frac{1}{t} \int_0^t g(\tilde{x}_s^{*}(y_0),\tilde{y}_t,\tilde{I}_s)\,ds \approx \frac{1}{\tau_y}\cdot\frac{1}{n} \sum_{s=1}^{n} g(\tilde{x}_s^{*}(y_0),\tilde{y}_t,\tilde{I}_s).$$

Can one justify this type of approximation? Or another, similar one? Or am I completely mistaken here?
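As a numerical sanity check of this averaging idea, here is a sketch with hypothetical choices $f(x,y,I) = -x + I$, $g(x,y,I) = -y + x$, $\tau_x = 0.01$, $\tau_y = 1$, and a rapidly varying input $I_t = 1 + 0.5\sin(20\pi t)$. Here the fast fixed point is $\tilde{x}_s^{*} = I_s$, so averaging $g$ over the input gives $\frac{d}{dt}\bar{y} = \frac{1}{\tau_y}(-\bar{y} + \bar{I})$ with $\bar{I} = 1$, i.e. $\bar{y}(t) = 1 - e^{-t/\tau_y}$, which we compare to the full simulation:

```python
import math

# Hypothetical fast-slow system: f(x,y,I) = -x + I, g(x,y,I) = -y + x.
tau_x, tau_y = 0.01, 1.0
I = lambda t: 1.0 + 0.5 * math.sin(20 * math.pi * t)  # fast input, mean 1

# Full system, forward Euler, integrated over a few slow time constants.
x, y = 1.0, 0.0  # x starts at its instantaneous fixed point x*(I_0) = I(0) = 1
dt, T = 1e-4, 3.0
for k in range(int(T / dt)):
    t = k * dt
    x, y = x + dt / tau_x * (-x + I(t)), y + dt / tau_y * (-y + x)

# Averaged slow equation: x is slaved to x*(I_s) = I_s, so
#   dy/dt ~ (1/tau_y) * mean_s g(x*(I_s), y) = (1/tau_y) * (-y + mean(I)),
# and mean(I) = 1 gives the closed-form averaged trajectory below.
y_avg = 1.0 - math.exp(-T / tau_y)

print(y, y_avg)  # the full slow trajectory tracks the averaged prediction
```

In this example the full slow trajectory stays within a small ripple of the averaged one, since $\tau_y$ filters out the fast input fluctuations; whether this holds in general is exactly the content of averaging theorems for slow-fast systems.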