Let $X$ be a random vector in $\mathbb{R}^n$ with covariance matrix $\Sigma$, and let $T$ be an $n \times n$ matrix.
Then $Y=TX$ is a random vector with covariance matrix $\hat{\Sigma} = T \Sigma T^\top$.
With known $\Sigma$ and $\hat{\Sigma}$, how can $T$ be calculated? In general, $\Sigma$ will not be diagonal.
There is a closely related question (1) which suggests that the Cholesky decomposition might be helpful, but I don't see how.
[UPDATE]
I found the answer for the case that $\Sigma$ and $\hat{\Sigma}$ are positive definite. How can a solution be obtained if this is not the case?
In the meantime I found this question, which is quite similar.
Given the Cholesky decompositions of $\Sigma$ and $\hat{\Sigma}$,
$$\Sigma = LL^\top$$ $$\hat{\Sigma} = MM^\top$$
substituting into $\hat{\Sigma} = T \Sigma T^\top$ gives $MM^\top = TLL^\top T^\top$, i.e. $M^{-1}TLL^\top T^\top (M^\top)^{-1} = (M^{-1}TL)(M^{-1}TL)^\top = I$. Hence $M^{-1}TL = Q$ for some orthogonal matrix $Q$, and thus
$$T=MQL^{-1}$$ is a solution for any orthogonal $Q$.
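A quick numerical sketch (using NumPy, with arbitrarily generated positive definite matrices — the specific construction $AA^\top + I$ is just a convenient way to get them) confirms that $T = MQL^{-1}$ satisfies $\hat{\Sigma} = T\Sigma T^\top$ for any orthogonal $Q$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Two arbitrary symmetric positive definite matrices
# (A A^T + I is always symmetric positive definite)
A = rng.normal(size=(n, n))
B = rng.normal(size=(n, n))
Sigma = A @ A.T + np.eye(n)
Sigma_hat = B @ B.T + np.eye(n)

L = np.linalg.cholesky(Sigma)      # Sigma = L L^T
M = np.linalg.cholesky(Sigma_hat)  # Sigma_hat = M M^T

# A random orthogonal Q from the QR factorization of a Gaussian matrix
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))

T = M @ Q @ np.linalg.inv(L)

# T Sigma T^T recovers Sigma_hat regardless of which orthogonal Q was used
print(np.allclose(T @ Sigma @ T.T, Sigma_hat))  # True
```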
Validating this for an example with $Q=I$ and $$ \Sigma = \left( \begin{matrix} 1 & -0.5 & 0 & 0 \\ -0.5 & 1 & 0 & 0 \\ 0 & 0 & 1 & -0.5 \\ 0 & 0 & -0.5 & 1 \end{matrix} \right) $$ $$\hat{\Sigma} = \left( \begin{matrix} 1 & 0.5 & 0.2 & 0.1 \\ 0.5 & 1 & 0.5 & 0.2 \\ 0.2 & 0.5 & 1 & 0.5 \\ 0.1 & 0.2 & 0.5 & 1 \end{matrix} \right)$$
From that we get a lower triangular matrix for $T$:
$$ T = \left( \begin{matrix} 1 & 0 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0.47 & 0.53 & 0.86 & 0 \\ 0.2 & 0.2 & 0.96 & 1 \end{matrix} \right) $$
With the desired property $\hat{\Sigma} = T \Sigma T^\top$.
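This validation can be reproduced with a few lines of NumPy (here $T = ML^{-1}$ is computed via a triangular solve rather than an explicit inverse):

```python
import numpy as np

# Covariance matrices from the example above
Sigma = np.array([[ 1.0, -0.5,  0.0,  0.0],
                  [-0.5,  1.0,  0.0,  0.0],
                  [ 0.0,  0.0,  1.0, -0.5],
                  [ 0.0,  0.0, -0.5,  1.0]])
Sigma_hat = np.array([[1.0, 0.5, 0.2, 0.1],
                      [0.5, 1.0, 0.5, 0.2],
                      [0.2, 0.5, 1.0, 0.5],
                      [0.1, 0.2, 0.5, 1.0]])

L = np.linalg.cholesky(Sigma)      # Sigma = L L^T
M = np.linalg.cholesky(Sigma_hat)  # Sigma_hat = M M^T

# With Q = I the solution is T = M L^{-1};
# solving T L = M avoids forming L^{-1} explicitly
T = np.linalg.solve(L.T, M.T).T

print(np.round(T, 2))  # matches the matrix above (up to rounding)
print(np.allclose(T @ Sigma @ T.T, Sigma_hat))  # True
```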
BUT the Cholesky decomposition requires $\hat{\Sigma}$ and $\Sigma$ to be positive definite, so this approach will not work for, e.g.,
$$ \Sigma = \left( \begin{matrix} 1 & -0.5 & 0 & -1 \\ -0.5 & 1 & 0 & 0 \\ 0 & 0 & 1 & -0.5 \\ -1 & 0 & -0.5 & 1 \end{matrix} \right) $$
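A quick NumPy check illustrates the failure: this particular $\Sigma$ turns out to be indefinite (it has a negative eigenvalue, so it is not a valid covariance matrix at all), and `np.linalg.cholesky` rejects it:

```python
import numpy as np

Sigma = np.array([[ 1.0, -0.5,  0.0, -1.0],
                  [-0.5,  1.0,  0.0,  0.0],
                  [ 0.0,  0.0,  1.0, -0.5],
                  [-1.0,  0.0, -0.5,  1.0]])

# The eigenvalues reveal that this Sigma is indefinite (one is negative)
print(np.linalg.eigvalsh(Sigma))

# Cholesky consequently fails with a LinAlgError
try:
    np.linalg.cholesky(Sigma)
except np.linalg.LinAlgError as err:
    print("Cholesky failed:", err)
```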