First the question I'm struggling with:
Consider the stationary Gaussian process $y:\Omega \times T \to \mathbb{R}^p$, with $y(t) \sim G(0,Q)$ and variance matrix $Q\in\mathbb{R}^{p\times p}$, $Q=Q^T\geq 0$. Determine a linear transformation of the process, say with matrix $S\in\mathbb{R}^{p\times p}$, such that the transformed process $x(t) = S y(t)$ has independent components at every time $t \in T$: $x_i(t)$ and $x_j(t)$ are independent for all $i,j\in\{1,2,\ldots,p\}$ with $i \neq j$ and all $t\in T$.
I'm from an engineering background, so the formal mathematics of this course always makes it hard to see through the formulations. In my mind, I'm searching for a matrix $S$ such that:
$$\begin{bmatrix} x_1(t)\\ x_2(t)\\ \vdots \\ x_p(t) \end{bmatrix} = S \begin{bmatrix} y_1(t)\\ y_2(t)\\ \vdots \\ y_p(t) \end{bmatrix}$$ where I should somehow show that the covariance of $(x_i(t),x_j(t))$ is $0$: $ \mathbf{E}\left[\left(x_i(t)-\mathbf{E}[x_i(t)]\right)\left(x_j(t)-\mathbf{E}[x_j(t)]\right)^T\right]=0$. Or am I thinking of the wrong approach? (And if not, how could I determine such a matrix?)
Since $\mathbf{E}[x_i(t)]=0$, we have $ \mathbf{E}\left[\left(x_i(t)-\mathbf{E}[x_i(t)]\right)\left(x_j(t)-\mathbf{E}[x_j(t)]\right)^T\right]=$ $ \mathbf{E}\left[x_i(t)x_j(t)^T\right]=$ $ \mathbf{E}\left[\left(S_{i,\bullet}y(t)\right)\left(S_{j,\bullet}y(t)\right)^T\right]=S_{i,\bullet}\mathbf{E}\left[y(t)y(t)^T\right]S_{j,\bullet}^T = S_{i,\bullet}QS_{j,\bullet}^T$, and this should equal $0$. So by definition the covariance between two components of $x(t)$ is given by the above expression, right? This means that $S$ should be set up in such a way that the $i$-th row of $S$, times the covariance matrix of $y(t)$, times the $j$-th row of $S$ transposed, is $0$ ($\forall i,j\in\{1,2,\ldots,p\}$ with $i\neq j$). That suggests an orthogonal matrix $S$, but how do I deal with the $Q$ in the middle of the multiplication?
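To make the condition concrete, here is a small numerical sketch (with a made-up $Q$ and an arbitrary $S$, purely for illustration) checking that the sample covariance of $x(t) = Sy(t)$ matches $SQS^T$, i.e. that $\mathrm{Cov}(x_i, x_j) = S_{i,\bullet} Q S_{j,\bullet}^T$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up symmetric positive semidefinite variance matrix Q,
# and an arbitrary (not yet decorrelating) matrix S.
A = rng.standard_normal((4, 4))
Q = A @ A.T  # Q = Q^T >= 0
S = rng.standard_normal((4, 4))

# Draw many samples y ~ N(0, Q) and transform them: x = S y.
y = rng.multivariate_normal(np.zeros(4), Q, size=500_000)
x = y @ S.T

# The sample covariance of x should approximate S Q S^T,
# so Cov(x_i, x_j) ~ S_{i,.} Q S_{j,.}^T entrywise.
emp = np.cov(x, rowvar=False)
print(np.allclose(emp, S @ Q @ S.T, atol=1.0))
```

So the question reduces to choosing $S$ such that the off-diagonal entries of $SQS^T$ vanish.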
You're right, it involves an orthogonal matrix.
The components of a Gaussian vector are independent if and only if its variance matrix is diagonal. Since $Q$ is symmetric and $Q\geq 0$, it can be diagonalized in an orthonormal basis: there exists an orthogonal matrix $P$ such that $Q = PDP^T$, where $D$ is a diagonal matrix with non-negative diagonal entries.
We also know that if $\Sigma$ is the variance matrix of a random vector $Y$ and $A$ a constant (non-random) matrix then the variance matrix of $AY$ is $A\Sigma A^T$.
Thus, $\operatorname{Var}(P^TY) = P^TQP = D$. The variance matrix of $P^TY$ is diagonal, so its components are independent. Hence we can choose $S=P^T$.
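A quick numerical sketch of this construction, using `numpy.linalg.eigh` (which returns exactly the decomposition $Q = PDP^T$ for a symmetric matrix) with a made-up $Q$ for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary symmetric positive semidefinite variance matrix Q.
A = rng.standard_normal((3, 3))
Q = A @ A.T  # Q = Q^T >= 0

# Diagonalize: eigh returns the diagonal of D and the orthogonal
# matrix P of eigenvectors, with Q = P @ np.diag(d) @ P.T.
d, P = np.linalg.eigh(Q)

S = P.T  # the decorrelating transformation S = P^T

# The variance matrix of x = S y is S Q S^T, which equals D (diagonal).
D = S @ Q @ S.T
print(np.allclose(D, np.diag(d)))  # True
```

Any diagonal rescaling of $S = P^T$ also works, since it only rescales the (already independent) components of $x(t)$.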