I am trying to find an approximate solution to the following integro-differential equation for the $n$-dimensional vector $\mathbf{x}(t)$ in some interval $t\in[t_0, t_1]$:
$$
\frac{d\mathbf{x}(t)}{dt} = A(t)\mathbf{x}(t) + A(t)\int_{t_0}^t G(t;s)B(s)\mathbf{x}(s)\,ds
$$
with initial condition $\mathbf{x}(t_0) = \mathbf{x}_0$. $A(t)$ and $B(t)$ are $n\times n$ matrices, the elements of which are of similar magnitude, and $G(t;s)$ is the $n\times n$ matrix such that
$$
\frac{dG(t;s)}{dt} = B(t)G(t;s), \quad G(s;s) = \mathbb{I}
$$
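(For concreteness, here is a small numerical sketch of how I currently handle $G(t;s)$, using a $2\times 2$, time-dependent $B(t)$ of my own choosing: direct RK4 integration of the defining ODE, checked against a two-term Peano–Baker truncation, which should be accurate when $(t-s)\,\lVert B\rVert$ is small.)

```python
import numpy as np

# Toy example (my own choice, not part of the problem): G(t;s) is the
# state-transition matrix of x' = B(t) x, obtained by integrating
# dG/du = B(u) G from G(s;s) = I.  Compared against the two-term
# Peano-Baker truncation
#   G(t;s) ~ I + int_s^t B(u) du + int_s^t B(u) int_s^u B(v) dv du.

def B(t):
    return np.array([[0.0, t], [-t, 0.0]])

def transition(s, t, steps=1000):
    """RK4 integration of dG/du = B(u) G with G(s;s) = I."""
    G = np.eye(2)
    h = (t - s) / steps
    u = s
    for _ in range(steps):
        k1 = B(u) @ G
        k2 = B(u + h / 2) @ (G + h / 2 * k1)
        k3 = B(u + h / 2) @ (G + h / 2 * k2)
        k4 = B(u + h) @ (G + h * k3)
        G = G + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        u += h
    return G

s, t = 0.0, 0.3
G = transition(s, t)

# two-term Peano-Baker truncation via simple Riemann sums
n = 400
u = np.linspace(s, t, n)
du = u[1] - u[0]
I1 = sum(B(ui) for ui in u) * du
I2 = sum(B(ui) @ (sum(B(vj) for vj in u[u <= ui]) * du) for ui in u) * du
G_pb = np.eye(2) + I1 + I2

print(np.max(np.abs(G - G_pb)))
```

For this skew-symmetric $B$ the matrices at different times commute, so $G$ is exactly a rotation by $\int_s^t u\,du$, and the two computations agree to quadrature accuracy.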
In particular, what I would like to do is to derive an approximate equation for $\mathbf{x}(t)$ by somehow reducing the integral in the second term on the right-hand side via an expansion in $A$ and $B$, so that
$$
\frac{d\mathbf{x}(t)}{dt}=A(t)\mathbf{x}(t)+ M(t)\mathbf{x}(t)
$$
where $M(t)$ effectively includes the effects of the integral.
$G(t;s)$ can be written as an expansion in $B$ via the Peano–Baker series, but I am not sure how to eliminate the dependence on earlier times in the integral. One idea is to use the Taylor series $\mathbf{x}(s)=\mathbf{x}(t) + (s-t)\left.\frac{d\mathbf{x}}{dt}\right|_t+\ldots$, truncated at lowest order, but I am not sure whether this is a rigorous approximation. What approach is usually used in such cases?
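To make the question concrete, here is a scalar ($n=1$) numerical sketch of the lowest-order version of this idea, with constant coefficients $a$, $b$ of my own choosing: replacing $\mathbf{x}(s)$ by $\mathbf{x}(t)$ inside the integral yields $M(t)=A(t)\int_{t_0}^t G(t;s)B(s)\,ds$, and the resulting local ODE seems to track the full equation closely when the coefficients are small:

```python
import numpy as np

# Scalar sanity check with constants of my own choosing:
# A(t) = a, B(t) = b, so G(t;s) = exp(b (t - s)).  Replacing x(s) by
# x(t) inside the memory integral gives the candidate
#   M(t) = a * int_{t0}^{t} exp(b (t - s)) b ds = a (exp(b (t - t0)) - 1).

a, b = -0.1, 0.05            # "small": a*(t1 - t0), b*(t1 - t0) << 1
t0, t1, N = 0.0, 1.0, 2000
h = (t1 - t0) / N
t = t0 + h * np.arange(N + 1)

# full integro-differential equation: explicit Euler in t,
# left-rectangle rule for the memory integral
x_full = np.empty(N + 1)
x_full[0] = 1.0
for k in range(N):
    s = t[:k + 1]
    integral = h * np.sum(np.exp(b * (t[k] - s)) * b * x_full[:k + 1])
    x_full[k + 1] = x_full[k] + h * (a * x_full[k] + a * integral)

# approximate local ODE  x' = (a + M(t)) x
x_apx = np.empty(N + 1)
x_apx[0] = 1.0
for k in range(N):
    M = a * (np.exp(b * (t[k] - t0)) - 1.0)
    x_apx[k + 1] = x_apx[k] + h * ((a + M) * x_apx[k])

err = np.max(np.abs(x_full - x_apx))
print(f"max deviation over [t0, t1]: {err:.2e}")
```

The agreement degrades when $a(t_1-t_0)$ or $b(t_1-t_0)$ is no longer small, which is part of why I am unsure whether the truncation is rigorous.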