Suppose I have the following system of linear inhomogeneous ODEs on $\mathbb{R}^{n}$:
$$\dot{x} = Ax + f(t), \qquad x(0)=x_{0}$$
where $x \in \mathbb{R}^{n}$, $f(t) \in \mathbb{R}^{n}$ is a time-dependent "forcing" term, and $A$ is an $n \times n$ matrix. For the integrating factor "$\exp(-tA)$" method to apply, what assumptions do we need on $f(t)$? Would Riemann integrability be necessary and sufficient?
The solution with initial condition $x(0)=x_0\in\Bbb R^n$ is $$ x(t)=e^{tA}x_0+e^{tA}\int_0^t e^{-sA}\,f(s)\,ds. $$ For the integral to be defined, it is enough that $f$ be Riemann (or Lebesgue) integrable; the resulting $x(t)$ is then absolutely continuous and satisfies the ODE only almost everywhere. If we want a classical (everywhere differentiable) solution, more regularity is needed: continuity of $f$ suffices, by the fundamental theorem of calculus.
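As a quick numerical sanity check of the formula above, here is a sketch comparing it against a direct ODE solve; the matrix $A$, the forcing $f$, and the initial condition are arbitrary illustrative choices, not anything from the question:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp, quad_vec

# Illustrative 2x2 system (A, f, x0 chosen arbitrarily; f is continuous,
# so the formula gives a classical solution)
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
x0 = np.array([1.0, 0.0])
f = lambda t: np.array([np.sin(t), 1.0])

def x_formula(t):
    # x(t) = e^{tA} x0 + e^{tA} * integral_0^t e^{-sA} f(s) ds
    integral, _ = quad_vec(lambda s: expm(-s * A) @ f(s), 0.0, t)
    return expm(t * A) @ (x0 + integral)

# Solve x' = Ax + f(t) numerically and compare at t = 3
sol = solve_ivp(lambda t, x: A @ x + f(t), (0.0, 3.0), x0,
                rtol=1e-10, atol=1e-12, dense_output=True)

err = np.max(np.abs(x_formula(3.0) - sol.sol(3.0)))
print(err)  # should be very small
```

The agreement between the two computations illustrates that the integrating-factor formula really does solve the initial value problem when $f$ is continuous.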