Calculating the covariance matrix


I'm working through exercises in my textbook and I'm stuck on calculating the covariance matrix.

We are given a random vector $$X = \pmatrix{X_1\cr X_2\cr}$$ with expected value $$\mu = \pmatrix{1\cr-1 \cr}$$ and covariance matrix $$\Sigma = \pmatrix{1 & 0 \cr 0 & 4 \cr}.$$ We then have to calculate the $\mu$ and $\Sigma$ of $$ Z = \pmatrix{Z_1\cr Z_2\cr} = \pmatrix{X_1 + X_2\cr X_1 - X_2\cr} $$

and determine whether $Z_1$ and $Z_2$ are independent.

I think that

$$ \mu = E(Z) = \pmatrix{E(Z_1)\cr E(Z_2)\cr} = \pmatrix{E(X_1) + E(X_2)\cr E(X_1) - E(X_2)\cr} = \pmatrix{0\cr 2\cr}, $$

but I'm not sure how to calculate $\Sigma$.

The second part of the question is also unclear to me; it goes as follows:

Now take the same $X$, but with $$\mu = \pmatrix{0\cr 0\cr}$$ and $$\Sigma = \pmatrix{1 & 0 \cr 0 & c \cr} \textrm{ for } c > 0.$$ For which values of $c$ are the components $Z_1$ and $Z_2$ uncorrelated?

I don't really understand how I should start with that question.

Best answer:

Hints (by no means complete solutions): by definition, $$\boldsymbol{\Sigma}_{Z} = \text{Var}(Z) = \mathbb{E}\left[(Z-\mu_{Z})(Z-\mu_{Z})^{T} \right]\text{.}$$
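
One way to expand this hint (my addition, not part of the original answer): $Z$ is a linear function of $X$, say $Z = AX$ with $$A = \pmatrix{1 & 1 \cr 1 & -1 \cr},$$ and for any linear transform the identity $$\Sigma_Z = A\,\Sigma\,A^{T}$$ follows directly from the definition above, since $Z - \mu_Z = A(X - \mu)$.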

(This is just the covariance matrix of $Z$. Some people use $\text{Cov}(X)$ instead.)
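
As a quick numerical sanity check (my own sketch, not part of the original answer), one can compare the closed-form $A\,\Sigma\,A^{T}$ against the empirical covariance of simulated samples of $Z$:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -1.0])                  # E[X] from the problem
Sigma = np.array([[1.0, 0.0], [0.0, 4.0]])  # Var(X) from the problem
A = np.array([[1.0, 1.0], [1.0, -1.0]])     # Z = A X

# Closed form: Sigma_Z = A Sigma A^T
Sigma_Z = A @ Sigma @ A.T

# Empirical check; a normal distribution is just a convenient choice --
# any distribution with this mean and covariance gives the same Var(Z).
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Z = X @ A.T
print(Sigma_Z)      # [[ 5. -3.] [-3.  5.]]
print(np.cov(Z.T))  # should be close to Sigma_Z
```

Note the off-diagonal entries of $\Sigma_Z$ are nonzero here, even though $X_1$ and $X_2$ were uncorrelated.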

Independence can't be inferred without knowing the joint distribution and the marginal distributions of $Z_1$ and $Z_2$. (Why?)

What do we mean when we say $Z_1$ and $Z_2$ are uncorrelated?
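
For completeness (my own addition, going one step beyond the hint): by bilinearity of covariance, and using $\text{Cov}(X_1, X_2) = 0$ from the diagonal $\Sigma$, $$\text{Cov}(Z_1, Z_2) = \text{Cov}(X_1 + X_2,\, X_1 - X_2) = \text{Var}(X_1) - \text{Var}(X_2) = 1 - c,$$ which is zero precisely when $c = 1$.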