How to solve matrix-valued differential equations of type $$A'(t)=A(t)B(t) \tag 1$$
All the given functions are square matrices of dimension $3$, and only $A(t)$ is invertible (neither $B(t)$ nor $A'(t)$ need be). $A(t)$ is a rotation matrix and is unknown apart from its initial value $A(0)$; $B(t)$ is given in explicit form.
NB: Any suggestions or useful links about solution methods are welcome.
Assuming $B(t)$ is, say, continuous, we can define a matrix $C(t)$ by $$ C(t):=\int_0^t B(s) \, ds. $$ If $C(t)B(t)=B(t)C(t)$ for all $t\geq 0$, then we can put the differential equation in the equivalent form (this is basically just the technique of multiplying by an integrating factor) $$ \frac{d}{dt} A(t)e^{-C(t)}=0, $$ which has the solution $A(t)=A(0)e^{C(t)}$.
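The commuting-case formula can be checked numerically. The sketch below (an illustration, not part of the original answer) takes $B(t)=\cos(t)\,B_0$ for a fixed skew-symmetric $B_0$, so that $C(t)=\sin(t)\,B_0$ commutes with $B(t)$, and compares direct integration of $A'=AB$ against $A(t)=A(0)e^{C(t)}$:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# Illustrative choice: B(t) = cos(t) * B0, so C(t) = sin(t) * B0 commutes with B(t).
# B0 is skew-symmetric, matching the rotation-matrix setting of the question.
B0 = np.array([[0.0,  1.0, 0.0],
               [-1.0, 0.0, 2.0],
               [0.0, -2.0, 0.0]])

def B(t):
    return np.cos(t) * B0

def C(t):
    return np.sin(t) * B0  # = integral of B(s) ds from 0 to t

A0 = np.eye(3)

# Integrate A'(t) = A(t) B(t) numerically, flattening A into a vector.
def rhs(t, a):
    A = a.reshape(3, 3)
    return (A @ B(t)).ravel()

t_end = 2.0
sol = solve_ivp(rhs, (0.0, t_end), A0.ravel(), rtol=1e-10, atol=1e-12)
A_numeric = sol.y[:, -1].reshape(3, 3)

# Closed form for the commuting case: A(t) = A(0) exp(C(t))
A_closed = A0 @ expm(C(t_end))

print(np.max(np.abs(A_numeric - A_closed)))  # agreement up to integration tolerance
```

Since $B_0$ is skew-symmetric, $e^{C(t)}$ is orthogonal and $A(t)$ remains a rotation, consistent with the question's setting.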
Following the suggestion of jnez71, the non-commutative case also has a satisfying solution. Suppose for simplicity that $B:[0,1]\rightarrow \mathbb{R}^{n\times n}$ is continuous. Denote by $\mathcal X$ the Banach space of continuous functions $f:[0,1]\rightarrow \mathbb{R}^{n\times n}$ with norm $\lVert f \rVert = \sup_{0\leqslant t \leqslant 1} \lVert f(t) \rVert_{\text{op}}$. Let $B(\mathcal X)$ denote the Banach algebra of bounded linear operators on $\mathcal X$. We will look for a solution $A\in \mathcal{X}$.
Define $T_B\in B(\mathcal X)$ by $$ (T_B f)(t) := \int_0^t f(x) B(x) \, dx. $$ Then the equation $A'(t)=A(t)B(t)$ with initial condition $A(0)=1$ may equivalently be written as \begin{equation} \tag{1} A = 1 + T_B A, \end{equation} where $1$ denotes both the identity matrix and the constant function $t\mapsto 1$. Note that, for $m\geqslant 1$, $$ (T_B^{m+1} f)(t) = \int ^t _0 \int ^{x_{m+1}}_0 \cdots \int ^{x_2}_0 f(x_{1}) B(x_{1})B(x_2)\cdots B(x_{m+1}) \, dx_1 dx_2 \cdots dx_{m+1}. $$ Since the simplex $0\leqslant x_1\leqslant \cdots \leqslant x_{m+1}\leqslant t\leqslant 1$ has volume $t^{m+1}/(m+1)!$, we find $$ \tag{2} \lVert T_B^{m+1}\rVert \leqslant \frac{\lVert B \rVert ^{m+1}}{(m+1)!} \overset{m\rightarrow\infty}{\rightarrow} 0. $$ Thus, the Neumann series $\sum_{j=0}^\infty T_B^j = (1-T_B)^{-1}$ is well defined.
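The operator iteration $A = \sum_j T_B^j 1$ can be sketched numerically by discretizing $[0,1]$ and applying $T_B$ via cumulative quadrature. The matrix $B_0$, the grid, and the trapezoidal rule below are all illustrative choices, not from the original answer; $B(t)=\cos(t)\,B_0$ is a commuting family only so that the partial sums can be checked against a known closed form, while the iteration itself never uses commutativity:

```python
import numpy as np
from scipy.linalg import expm

# Discretize [0, t_end] and represent f in X by its values on the grid.
n, steps, t_end = 3, 2000, 1.0
ts = np.linspace(0.0, t_end, steps + 1)
dt = ts[1] - ts[0]
B0 = np.array([[0.0,  1.0, 0.0],
               [-1.0, 0.0, 2.0],
               [0.0, -2.0, 0.0]])
Bs = np.cos(ts)[:, None, None] * B0          # B(t_k) at each grid point

def T_B(f):
    """(T_B f)(t_k) ~ trapezoidal approximation of the integral of f(x) B(x) over [0, t_k]."""
    integrand = np.einsum('kij,kjl->kil', f, Bs)   # f(t_k) @ B(t_k) for every k
    g = np.zeros_like(f)
    g[1:] = np.cumsum(0.5 * (integrand[:-1] + integrand[1:]), axis=0) * dt
    return g

# Partial sums of the Neumann series: A = 1 + T_B 1 + T_B^2 1 + ...
term = np.broadcast_to(np.eye(n), (steps + 1, n, n)).copy()  # constant function 1
A = term.copy()
for _ in range(20):
    term = T_B(term)
    A += term

# In this commuting example the closed form is A(t) = exp(sin(t) * B0).
A_exact = expm(np.sin(t_end) * B0)
print(np.max(np.abs(A[-1] - A_exact)))
```

The 20 terms retained are far more than needed: by the bound (2), the truncation error decays like $\lVert B\rVert^{m}/m!$, so quadrature error dominates.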
Uniqueness of $A$: By repeated application of formula (1), we find $A = \sum_{j=0}^m T_B^j 1 + T_B^{m+1} A$. Since $\lVert T_B^{m+1} A \rVert \rightarrow 0$ by the bound (2), we therefore have $A = \sum_{j=0}^\infty T_B^j 1$.
Existence: It is straightforward to verify that $A := (1-T_B)^{-1} 1$ satisfies formula (1).
Final comments: We have thus found the formula \begin{align*} A(t) &= 1 + \int_{0}^t B(x_1) \, dx_1 + \int ^t _0 \int ^{x_2}_0 B(x_{1}) B(x_{2}) \, dx_1 dx_{2}\\ & \qquad + \sum_{j=3}^\infty \int ^t _0 \int^{x_j}_0 \cdots \int ^{x_2}_0 B(x_{1})B(x_2) \cdots B(x_{j}) \, dx_1 dx_2 \cdots dx_{j}. \end{align*} Except for the ordering of the matrices, this is the Peano-Baker series. In the publication linked to by jnez71, 'Product Integration, Its History and Applications' by Slavík, this formula appears as a formula for the right product integral. If $A$ is unitary and we make the replacement $B \rightarrow -i\tilde B$, the formula is known to practitioners of quantum mechanics as the Dyson series.
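The right product integral mentioned above also suggests a direct numerical scheme: since $A(t+\Delta) \approx A(t)\,e^{B(t)\Delta}$, one can approximate $A(t)$ by the ordered product $A(0)\,e^{B(t_0)\Delta}e^{B(t_1)\Delta}\cdots$, with new factors multiplying on the right. The sketch below (an illustration with an arbitrarily chosen non-commuting skew-symmetric $B(t)$, not part of the original answer) compares this product against direct integration of $A'=AB$:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# Illustrative non-commuting family; skew-symmetric, so A(t) stays orthogonal.
def B(t):
    return np.array([[0.0,        1.0,       t],
                     [-1.0,       0.0,       np.sin(t)],
                     [-t, -np.sin(t),        0.0]])

t_end, N = 1.5, 4000
dt = t_end / N

# Right product integral: new exponential factors multiply on the RIGHT,
# matching the ordering in A'(t) = A(t) B(t).
A = np.eye(3)
for k in range(N):
    A = A @ expm(B(k * dt) * dt)

# Reference: direct numerical integration of A'(t) = A(t) B(t).
def rhs(t, a):
    M = a.reshape(3, 3)
    return (M @ B(t)).ravel()

ref = solve_ivp(rhs, (0.0, t_end), np.eye(3).ravel(), rtol=1e-10, atol=1e-12)
A_ref = ref.y[:, -1].reshape(3, 3)

print(np.max(np.abs(A - A_ref)))        # first-order accurate in dt
```

A pleasant feature of the product form in the rotation setting of the question: each factor $e^{B(t_k)\Delta}$ is exactly orthogonal, so the approximation stays on the rotation group regardless of step size.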
Slavík, Antonín. Product Integration, Its History and Applications. Dějiny Matematiky / History of Mathematics 29; Jindřich Nečas Center for Mathematical Modeling Lecture Notes 1. Prague: Matfyzpress, 2007. 147 pp. ISBN 80-7378-006-2. Zbl 1216.28001.