I am given that $X = (X_1, X_2) \sim \mathcal{N}_2(\mu, \Sigma)$, where $$ \mu = \begin{bmatrix} 3 \\ -2 \end{bmatrix} \quad \text{ and } \quad \Sigma = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} $$ and I am tasked with expressing $X$ as a linear function of two independent standard normal random variables, but I am not entirely sure how to do so. From the provided information, it is clear that $X_1 \sim \mathcal{N}(3,2)$ and $X_2 \sim \mathcal{N}(-2,2)$ with $\text{Cov}(X_1, X_2) = -1$. My first attempt was to rewrite the characteristic function of this random vector to separate out the independent parts: \begin{align*} \phi_X(u) &= e^{i \mu^T u - \frac{1}{2} u^T \Sigma u } \\ &= e^{ i(3u_1 - 2u_2) - u_1^2 + u_1 u_2 - u_2^2 } \\ &= e^{ (3i u_1 - u_1^2) + (-2i u_2 - u_2^2) + u_1 u_2 } \end{align*} However, the cross term $u_1 u_2$ (arising from the covariance between $X_1$ and $X_2$) prevents the exponent from splitting into a function of $u_1$ alone plus a function of $u_2$ alone, so this method fails.
I know that I want to define two random variables, say $Y_1$ and $Y_2$, with $\text{Cov}(Y_1, Y_2) = 0$ such that $X = (X_1, X_2) = \mu + aY_1 + bY_2$, where $a$ and $b$ are constant vectors, but I am unsure how to determine these random variables. Is there some way to factor the characteristic function above to get the desired result, or should I approach this using a different method?
EDIT: Using the comment from William M., this is what I have so far:
Through eigen-decomposition, we may write $\Sigma = V D V^T$, where $V$ is the orthogonal matrix whose columns are unit eigenvectors and $D$ is the diagonal matrix of eigenvalues. We find that $$ V = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \quad \text{ and } \quad D = \begin{bmatrix} 1 & 0 \\ 0 & 3 \end{bmatrix} $$ (Normalizing the eigenvectors matters here: with the unnormalized columns $(1,1)^T$ and $(1,-1)^T$, we would get $A A^T = 2\Sigma$ rather than $\Sigma$.) Then, to express $\Sigma$ as $\Sigma = A A^T$, take $$ A = V \sqrt{D} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & \sqrt{3} \end{bmatrix} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & \sqrt{3} \\ 1 & -\sqrt{3} \end{bmatrix} $$ so that $A A^T = V \sqrt{D} \sqrt{D}\, V^T = V D V^T = \Sigma$, and $$ A^{-1} = \begin{bmatrix} \tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{6}} & -\tfrac{1}{\sqrt{6}} \end{bmatrix} $$ Then, $$ Y = \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix} = A^{-1} (X - \mu) = \begin{bmatrix} \tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{6}} & -\tfrac{1}{\sqrt{6}} \end{bmatrix} \begin{bmatrix} X_1 - 3 \\ X_2 + 2 \end{bmatrix} = \begin{bmatrix} \frac{X_1 + X_2 - 1}{\sqrt{2}} \\[4pt] \frac{X_1 - X_2 - 5}{\sqrt{6}} \end{bmatrix} $$ has zero mean and the identity matrix $I$ as its dispersion matrix, since $\operatorname{Cov}(Y) = A^{-1} \Sigma (A^{-1})^T = I$. Finally, we may represent $X$ as $$ X = AY + \mu $$ as desired.
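For anyone who wants to double-check the algebra, here is a quick NumPy sanity check (not part of the derivation, just a numerical sketch): it builds $A$ from the eigen-decomposition of $\Sigma$, verifies $A A^T = \Sigma$ and $A^{-1} \Sigma (A^{-1})^T = I$, and confirms by simulation that $X = AY + \mu$ with $Y \sim \mathcal{N}(0, I)$ has the required mean and covariance. Note that `np.linalg.eigh` may return the unit eigenvectors with different signs than the hand computation, but $A A^T = \Sigma$ holds regardless.

```python
import numpy as np

# Problem data
mu = np.array([3.0, -2.0])
Sigma = np.array([[2.0, -1.0],
                  [-1.0, 2.0]])

# Eigen-decomposition of the symmetric matrix Sigma; eigh returns
# orthonormal eigenvectors, so V is orthogonal automatically.
eigvals, V = np.linalg.eigh(Sigma)
A = V @ np.diag(np.sqrt(eigvals))

# Factorization check: Sigma = A A^T
assert np.allclose(A @ A.T, Sigma)

# Y = A^{-1}(X - mu) should have identity covariance:
# Cov(Y) = A^{-1} Sigma (A^{-1})^T = I
A_inv = np.linalg.inv(A)
assert np.allclose(A_inv @ Sigma @ A_inv.T, np.eye(2))

# Monte Carlo check of X = A Y + mu with Y ~ N(0, I)
rng = np.random.default_rng(0)
Y = rng.standard_normal((100_000, 2))
X = Y @ A.T + mu
print(X.mean(axis=0))   # close to [3, -2]
print(np.cov(X.T))      # close to [[2, -1], [-1, 2]]
```

The same check run with unnormalized eigenvector columns fails the first assertion, which is how the missing $1/\sqrt{2}$ factor shows up numerically.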