I have been working on a problem in quantum mechanics and I have encountered an equation as given below.
$$\frac{d\hat A(t)}{dt} = \hat F(t)\hat A(t)$$
where the hat denotes that $\hat A$ and $\hat F$ are operators.
How is this differential equation solved? Do the usual rules for linear homogeneous first-order differential equations with variable coefficients apply here?
You can solve it by iteration (assuming convergence). Assuming that you are interested in the solution with the initial condition $\hat A(0)= I$, the iterative solution reads $$\hat A(t) = I +\int_0^t\hat F(t_1)\,dt_1 + \int_0^t\int_0^{t_1}\hat F(t_1) \hat F(t_2)\,dt_2\,dt_1 + \cdots \tag{1}$$
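As a sanity check, the iteration can be carried out numerically: starting from $\hat A_0(t) = I$, repeatedly substitute into $\hat A_{n+1}(t) = I + \int_0^t \hat F(s)\hat A_n(s)\,ds$, which generates exactly the partial sums of (1). A minimal sketch with NumPy, using trapezoidal quadrature on a time grid (the function and grid names are my own choices, not anything standard):

```python
import numpy as np

def picard_iteration(F, t_grid, n_iter=12):
    """Solve dA/dt = F(t) A with A(0) = I by Picard iteration.

    F      : callable t -> (d, d) matrix
    t_grid : increasing 1-D array of times starting at 0
    Returns the n_iter-th iterate A(t) at every grid point.
    """
    d = F(t_grid[0]).shape[0]
    I = np.eye(d)
    A = np.array([I for _ in t_grid])            # A_0(t) = I
    Fs = np.array([F(t) for t in t_grid])
    for _ in range(n_iter):
        # integrand F(s) A_n(s) at every grid point
        G = np.einsum('tij,tjk->tik', Fs, A)
        # A_{n+1}(t) = I + cumulative trapezoidal integral of G
        new_A = [I.copy()]
        for k in range(1, len(t_grid)):
            dt = t_grid[k] - t_grid[k - 1]
            new_A.append(new_A[-1] + 0.5 * dt * (G[k - 1] + G[k]))
        A = np.array(new_A)
    return A
```

For a constant $F$ the iterates converge to the ordinary matrix exponential $e^{Ft}$, which gives an easy check of the scheme; for a genuinely time-dependent $F$ the same loop produces the time-ordered series term by term.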
For convenience, one might introduce the concept of the ordered exponential. With that the solution assumes the compact form $$\hat A(t) = \mathcal{T} \left\{\exp\left[ \int_0^t \hat F(t')\,dt'\right] \right\}$$ where $\mathcal{T}$ indicates that, when expanding the exponential, the factors of $\hat F$ in the individual terms should be ordered according to their time argument, with later times to the left (thus reproducing (1)).