The problem: Consider constant matrices $A,R\in\mathbb{R}^{n\times n}$. I want to solve: $$ \frac{dP(t)}{dt} = AP(t)+P(t)A^T+R $$ for $P(t)\in\mathbb{R}^{n\times n}$.
My attempt: note that the differential equation is linear, so up to now I have been able to write $P(t)$ as a vector $p(t)\in\mathbb{R}^{n^2}$ using the vectorization operator $p(t):=\text{vec}(P(t))$ from wikipedia, which satisfies: $$ \text{vec}(AB) = (I\otimes A)\text{vec}(B) = (B^T\otimes I)\text{vec}(A) $$ hence, $$ \begin{aligned} \text{vec}\left(\frac{dP(t)}{dt}\right) &= \frac{dp(t)}{dt}= \text{vec}(AP(t)) + \text{vec}(P(t)A^T) + \text{vec}(R)\\ &=(I\otimes A)p(t) + (A\otimes I)p(t)+r\\ &=\tilde{A}p(t) + r \end{aligned} $$ where $\tilde{A} = (I\otimes A) + (A\otimes I)$ and $r=\text{vec}(R)$. Then, $$ p(t) = \exp(\tilde{A}t)p(0) + \int_0^t\exp(\tilde{A}(t-\tau))r\,d\tau $$ However, I still haven't managed to convert $p(t)$ back to $P(t)$ without making a total mess of the components of matrices of the form $\exp(\tilde{A}t)$. I have only been able to "revert" the vectorization operation "by hand", so I don't see the pattern in the form of $P(t)$ I obtain. Moreover, I have only done this for particular values of $n$ ($n=2$, $n=3$), and I don't see the pattern for general $n$.
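Numerically, un-vectorizing is painless: $\operatorname{vec}$ stacks columns, so its inverse is just a column-major reshape. Here is a small sketch of the whole vectorized procedure (assuming SciPy is available; the block-matrix trick used for the integral term is Van Loan's method, not something from the question itself):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
R = rng.standard_normal((n, n))
P0 = rng.standard_normal((n, n))

I = np.eye(n)
A_til = np.kron(I, A) + np.kron(A, I)   # (I ⊗ A) + (A ⊗ I)
r = R.reshape(-1, order="F")            # vec(R): stack columns
p0 = P0.reshape(-1, order="F")          # vec(P(0))

t = 0.7
# Van Loan block trick: the top-right block of exp(t*[[Ã, r],[0, 0]])
# equals ∫_0^t exp(Ã s) r ds = ∫_0^t exp(Ã (t-τ)) r dτ
M = np.zeros((n * n + 1, n * n + 1))
M[: n * n, : n * n] = A_til
M[: n * n, -1] = r

def P_at(s):
    """p(s) = exp(Ã s) p0 + ∫_0^s exp(Ã (s-τ)) r dτ, un-vectorized."""
    p = expm(s * A_til) @ p0 + expm(s * M)[: n * n, -1]
    return p.reshape((n, n), order="F")  # vec⁻¹ = column-major reshape

P_t = P_at(t)

# sanity check via central finite differences: dP/dt ≈ A P + P Aᵀ + R
h = 1e-5
dP = (P_at(t + h) - P_at(t - h)) / (2 * h)
assert np.allclose(dP, A @ P_t + P_t @ A.T + R, atol=1e-6)
```

The reshapes use `order="F"` because $\operatorname{vec}$ is defined by column stacking; NumPy's default row-major order would silently transpose everything.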
Is there a way to obtain a more compact closed-form solution for $P(t)$ (i.e., to convert $p(t)$ to $P(t)$ in the general case)?
Moreover, my procedure may be totally wrong. In that case, can you suggest the correct path?
EDIT:
I found this paper here where they obtain: $$ P(t) = \exp(At)P(0)\exp(A^Tt) + \int_0^t\exp(A\tau)R\exp(A^T\tau)d\tau $$ which is easily verified to satisfy the differential equation. However, I still have no clue how one can derive that solution.
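As a quick cross-check (assuming SciPy; `quad_vec` handles the matrix-valued quadrature), the paper's formula and the vectorized solution above agree numerically:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
R = rng.standard_normal((n, n))
P0 = rng.standard_normal((n, n))
t = 0.5

# paper's closed form: e^{At} P(0) e^{Aᵀt} + ∫_0^t e^{Aτ} R e^{Aᵀτ} dτ
integral, _ = quad_vec(lambda tau: expm(A * tau) @ R @ expm(A.T * tau), 0.0, t)
P_paper = expm(A * t) @ P0 @ expm(A.T * t) + integral

# vectorized solution: p(t) = e^{Ãt} p(0) + ∫_0^t e^{Ã(t-τ)} r dτ
I = np.eye(n)
A_til = np.kron(I, A) + np.kron(A, I)
r = R.reshape(-1, order="F")
part, _ = quad_vec(lambda tau: expm(A_til * (t - tau)) @ r, 0.0, t)
p_t = expm(A_til * t) @ P0.reshape(-1, order="F") + part
P_vec = p_t.reshape((n, n), order="F")

assert np.allclose(P_paper, P_vec)
```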
Your solution seems to be correct.
Let me outline another way to write it without the $\operatorname{vec}$ operation (which is basically equivalent here, but might work better in some cases): let $L(M) = AM + M A^T$ for any $M\in \mathbb R^{n\times n}$. This defines a linear operator $L$, and the equation can be rewritten as $$\dot{P} = L(P) + R$$
This linear equation can be solved using the exponential of $L$: $$P(t) = e^{tL}(P(0)) + \int_0^te^{(t-s)L}(R)\,\text ds\hspace{1.4cm}(*)$$
Without a nice expression for $L^n$ or $\exp(tL)$ becoming apparent, the only way I see to push the calculation forward is to try to diagonalize $L$. It is not clear that this is possible in general, but if $A$ is a normal matrix (i.e., it commutes with $A^T$), then $L$ is also normal (with respect to the scalar product on $\mathbb R^{n\times n}$ given by $(M,N) = \text{Tr}(M^TN)$). The spectral theorem then ensures that $L$ is diagonalizable. By expanding $P(0)$ and $R$ over the eigenbasis, you can compute $P(t)$ explicitly.
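For a concrete instance of the normal case: if $A$ is symmetric with eigenpairs $(d_i, v_i)$, then the rank-one matrices $v_iv_j^T$ form an eigenbasis of $L$ with eigenvalues $d_i+d_j$, since $Av_iv_j^T + v_iv_j^TA^T = (d_i+d_j)v_iv_j^T$. A quick numerical check (symmetric $A$ chosen for simplicity; any normal $A$ would do):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
A = A + A.T                      # symmetric, hence normal
d, V = np.linalg.eigh(A)         # A V[:,i] = d[i] V[:,i]

# L(v_i v_jᵀ) = A v_i v_jᵀ + v_i v_jᵀ Aᵀ = (d_i + d_j) v_i v_jᵀ
for i in range(n):
    for j in range(n):
        E = np.outer(V[:, i], V[:, j])
        assert np.allclose(A @ E + E @ A.T, (d[i] + d[j]) * E)
```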
Edit
Actually, we can derive an analytical solution. Let $\rho_{B}(M) = MB$ and $\lambda_B(M) = BM$ (right and left multiplication, respectively). Then we have:
$$L = \lambda_{A} + \rho_{A^T}$$ $$\lambda_A \rho_{A^T}(M) = AMA^{T} = \rho_{A^T}\lambda_A(M)$$
Since $\lambda_{A}$ and $\rho_{A^T}$ commute, we have: $$\exp(tL)= \exp(t\lambda_A)\exp(t\rho_{A^T})$$
Finally, we notice that $\exp(t\lambda_A) = \lambda_{\exp(tA)}$ and $\exp(t\rho_{A^T}) = \rho_{\exp(tA^T)}$. Inserting this into $(*)$, we get: $$P(t) = e^{tA}P(0)e^{tA^{T}} + \int_0^t e^{(t-s)A} R e^{(t-s)A^T} \,\text ds$$
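In $\operatorname{vec}$ coordinates this factorization reads $\exp(t\tilde A) = (I\otimes e^{tA})(e^{tA}\otimes I) = e^{tA}\otimes e^{tA}$, which is precisely the matrix of $M\mapsto e^{tA}Me^{tA^T}$. A quick check of both claims (assuming SciPy):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
t = 0.9

I = np.eye(n)
A_til = np.kron(I, A) + np.kron(A, I)   # vec form of L = λ_A + ρ_{Aᵀ}
E = expm(t * A)

# since I⊗A and A⊗I commute, the exponential factorizes:
# exp(tÃ) = (I ⊗ e^{tA})(e^{tA} ⊗ I) = e^{tA} ⊗ e^{tA}
assert np.allclose(expm(t * A_til), np.kron(E, E))

# and e^{tA} ⊗ e^{tA} is the vec form of M ↦ e^{tA} M e^{tAᵀ}
M = rng.standard_normal((n, n))
lhs = (np.kron(E, E) @ M.reshape(-1, order="F")).reshape((n, n), order="F")
assert np.allclose(lhs, E @ M @ E.T)
```

This also answers the original question directly: un-vectorizing $\exp(\tilde At)p(0)$ gives exactly $e^{tA}P(0)e^{tA^T}$.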