Given the following differential equation: $$ \dot{y}_1 + p(t)f(y_1) = g(t) $$ I want to linearize it, in the sense that it becomes a system of equations of the form: $$ \dot{y}_1 + p_1(t)y_1(t) = g_1(t) $$ $$ \dot{y}_2 + p_2(t)y_2(t) = g_2(t) $$ $$ \vdots $$ $$ \dot{y}_n + p_n(t)y_n(t) = g_n(t) $$ where I allow any finite number of additional linear differential equations; the functions $g_i(t)$ and $p_i(t)$ may also depend on any $y_j$ with $i \neq j$.
Is this possible? Preferably exactly (of course), or at least as an approximation that converges to the exact case as $n \to \infty$. Any ideas or references are highly appreciated.
Warning: this answer may contain traces of engineering language and therefore a lack of rigor.
One way to do this would be to split the function into local representations, each valid on its own interval, where we assume the function is well behaved in each local neighbourhood. We then require the representations to join smoothly at the boundaries between these intervals. For instance, we could use polynomials windowed by a function from the exponential family.
The easiest example would probably be a Gaussian windowing a set of polynomials. Polynomials are easy to differentiate, so smoothness at the boundaries translates into linear constraints on the polynomial coefficients.
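As a rough numerical sketch of this idea (all function names and parameter choices here are my own, not part of any standard method): fit one low-order polynomial per window center using Gaussian-weighted least squares, then blend the local fits with the normalized Gaussian windows. Once the nonlinear $f(y)$ is replaced by a local polynomial, the original equation becomes (locally) linear in $y$ on each window.

```python
import numpy as np

def gaussian_windowed_fit(f, centers, width, degree=2, n_samples=200):
    """Approximate f by local polynomials, each fitted with a Gaussian
    weight around one center, then blended into a single smooth function.
    This is only an illustrative sketch of the windowing idea."""
    y = np.linspace(centers[0] - 2 * width, centers[-1] + 2 * width, n_samples)
    polys = []
    for c in centers:
        w = np.exp(-0.5 * ((y - c) / width) ** 2)        # Gaussian window
        # np.polyfit's w multiplies the unsquared residual, so sqrt(w)
        # gives an effective weight of w on the squared error.
        coeffs = np.polyfit(y, f(y), degree, w=np.sqrt(w))
        polys.append(np.poly1d(coeffs))

    def f_approx(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        weights = np.array([np.exp(-0.5 * ((x - c) / width) ** 2)
                            for c in centers])
        weights /= weights.sum(axis=0)                   # partition of unity
        vals = np.array([p(x) for p in polys])
        return (weights * vals).sum(axis=0)

    return f_approx

# Example: local quadratics reproduce sin(y) closely on [0, 2*pi].
f = np.sin
centers = np.linspace(0.0, 2 * np.pi, 9)
f_hat = gaussian_windowed_fit(f, centers, width=0.5, degree=2)

x = np.linspace(0.0, 2 * np.pi, 101)
err = np.max(np.abs(f_hat(x) - f(x)))
print(err)  # shrinks as more/narrower windows are added
```

With $f$ replaced by such a local polynomial, each window $k$ with linear part $a_k + b_k y$ turns $\dot{y} + p(t)f(y) = g(t)$ into $\dot{y} + p(t)b_k\,y = g(t) - p(t)a_k$, which is exactly the target linear form with $p_k(t) = p(t)b_k$ and $g_k(t) = g(t) - p(t)a_k$.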