Covariance matrix of Y when we have the covariance matrix of X


If the random vector $\mathbf{X}$ has covariance matrix $$ \mathbf{C}_X = \begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix} $$ and is transformed according to \begin{align*} Y_1 &= X_1\\ Y_2 &= X_1 + X_2, \end{align*} find the covariance matrix of $\mathbf{Y} = [Y_1 \;\; Y_2]^{\intercal}$.


My Solution

Let $\mathbf{Y} = \mathbf{A}\mathbf{X}$. Then, by definition, $\mathbf{C}_Y = \mathbf{V}^{\intercal}\mathbf{C}_X\mathbf{V} = \Lambda$, where $\mathbf{A} = \mathbf{V}^{\intercal}$, $\mathbf{V}$ is the matrix of eigenvectors of $\mathbf{C}_X$ corresponding to its eigenvalues $\lambda_i$, and $\Lambda$ is a diagonal matrix.

First, we need to find the eigenvalues and eigenvectors of $\mathbf{C}_X$. The characteristic polynomial of a $2\times 2$ matrix $\mathbf{M}$ is $$ p(\lambda) = \lambda^2 - \lambda\text{tr}(\mathbf{M}) + \det(\mathbf{M}), $$ where tr denotes the trace. Here, $$ p(\lambda) = \lambda^2 - 4\lambda + 3 = (\lambda - 3)(\lambda - 1) = 0, $$ so the eigenvalues are $\lambda = 1, 3$.

The eigenvector for $\lambda = 1$ can be found by row-reducing $\mathbf{C}_X - \lambda\mathbf{I}$: $$ \begin{bmatrix} 1 & 1\\ 1 & 1 \end{bmatrix} \Rightarrow \begin{bmatrix} 1 & 1\\ 0 & 0 \end{bmatrix}. $$ This tells us that $x_1 = -x_2$, with $x_2$ a free variable. In other words, $$ \mathbf{v}_{\lambda = 1} = \begin{bmatrix} -x_2\\ x_2 \end{bmatrix} = x_2 \begin{bmatrix} -1\\ 1 \end{bmatrix}. $$ Similarly, for $\lambda = 3$ we have $$ \mathbf{v}_{\lambda = 3} = x_2 \begin{bmatrix} 1\\ 1 \end{bmatrix}. $$

Finally, our matrix $\mathbf{V}$ is $$ \mathbf{V} = \begin{bmatrix} -1 & 1\\ 1 & 1 \end{bmatrix}, $$ so $$ \mathbf{C}_Y = \mathbf{V}^{\intercal}\mathbf{C}_X\mathbf{V} = \begin{bmatrix} -1 & 1\\ 1 & 1 \end{bmatrix} \begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix} \begin{bmatrix} -1 & 1\\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 0\\ 0 & 6 \end{bmatrix}. $$
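A quick NumPy sanity check of the arithmetic above (assuming NumPy is available). It confirms the eigenvalues $1, 3$ and the product $\mathbf{V}^{\intercal}\mathbf{C}_X\mathbf{V}$; note that the result is $2\Lambda$ rather than $\Lambda$, because the eigenvectors used here have norm $\sqrt{2}$ instead of $1$:

```python
import numpy as np

C_X = np.array([[2.0, 1.0],
                [1.0, 2.0]])

# Eigenvalues of the symmetric matrix C_X, in ascending order
eigvals = np.linalg.eigvalsh(C_X)  # array([1., 3.])

# Unnormalized eigenvectors from the row reduction above, as columns of V
V = np.array([[-1.0, 1.0],
              [ 1.0, 1.0]])

# V^T C_X V is diagonal, but equals 2*diag(1, 3) rather than diag(1, 3),
# because each column of V has squared norm 2
D = V.T @ C_X @ V  # [[2. 0.] [0. 6.]]
```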


The book's answer

The book's solution is $$ \begin{bmatrix} 2 & 3\\ 3 & 6 \end{bmatrix}. $$ Why is this? I thought that, by definition, the covariance matrix of $\mathbf{Y}$ should be diagonal.


Best Answer

The entries of the covariance matrix of $\vec{Y}$ are $$C^Y_{ij}=Cov(Y_i, Y_j)$$ for $i,j=1,2$. Moreover, the matrix is symmetric, since $$C^Y_{ij}=Cov(Y_i, Y_j)=Cov(Y_j,Y_i)=C^Y_{ji}.$$ Thus $$\begin{align*}C^Y_{11}&=Cov(Y_1,Y_1)=Var(Y_1)=Var(X_1)=Cov(X_1,X_1)=C^{X}_{11}=2\\C^Y_{12}&=Cov(Y_1,Y_2)=Cov(X_1,X_1+X_2)=Cov(X_1,X_1)+Cov(X_1,X_2)=\\&=C^{X}_{11}+C^{X}_{12}=2+1=3\\C^Y_{21}&=C^Y_{12}=3 \\C^Y_{22}&=Cov(Y_2,Y_2)=Var(Y_2)=Var(X_1+X_2)=Var(X_1)+Var(X_2)+2Cov(X_1,X_2)=\\&=C^{X}_{11}+C^{X}_{22}+2C^{X}_{12}=2+2+2\cdot1=6\end{align*}$$ Hence $$C^Y=\begin{bmatrix} 2&3\\3&6 \end{bmatrix}.$$

In matrix form, writing $\mathbf{Y} = \mathbf{A}\mathbf{X}$ with $\mathbf{A} = \begin{bmatrix} 1 & 0\\ 1 & 1 \end{bmatrix}$, this is the general rule $\mathbf{C}_Y = \mathbf{A}\mathbf{C}_X\mathbf{A}^{\intercal}$. There is no reason for $\mathbf{C}_Y$ to be diagonal: that happens only when $\mathbf{A}$ is the transpose of an orthonormal eigenvector matrix of $\mathbf{C}_X$, which this $\mathbf{A}$ is not.
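The entry-by-entry computation above can be verified numerically with the rule $\mathbf{C}_Y = \mathbf{A}\mathbf{C}_X\mathbf{A}^{\intercal}$ (a minimal NumPy sketch, assuming NumPy is available):

```python
import numpy as np

# Transformation Y = A X, i.e. Y1 = X1 and Y2 = X1 + X2
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])
C_X = np.array([[2.0, 1.0],
                [1.0, 2.0]])

# Covariance of a linear transform: C_Y = A C_X A^T
C_Y = A @ C_X @ A.T
print(C_Y)  # [[2. 3.] [3. 6.]] -- matching the book's answer
```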