Proving that two r.v. are uncorrelated


Suppose we have a random vector $x=(x_1,\dots,x_p)$ with mean $m$ and covariance matrix $\Sigma$. Now we decompose $\Sigma$ using the spectral decomposition $$ \Sigma = U D U^T $$ where $D=\operatorname{diag}(\lambda_1,\dots,\lambda_p)$ is the diagonal matrix of eigenvalues and the columns of $U$ are the corresponding eigenvectors $v_i$.

Now I define $y_i=v_i^T(x-m)$, or more compactly $Y=U^T(x-m)$, and I want to verify the following:

  1. $E[y_i]=0$
  2. $Var(y_i)=\lambda_i$
  3. $Cov(y_i,y_j)=0$ for $i\not= j$
  4. $Var(y_1)\ge Var(y_2)\ge \dots \ge Var(y_p)$

I was able to prove the first one. I think one can start directly with the third one and establish 2 along the way: $$ Cov(y_i,y_j)=E[y_iy_j]=E\left[\left(\sum_{l=1}^pv_{li}(x_l-m_l)\right)\left(\sum_{k=1}^pv_{kj}(x_k-m_k)\right)\right] $$ How can I simplify this expression further? As for the fourth one, I think it is a result from linear algebra, but I can't find the right one; a reference would be appreciated.


Best answer:

You have these matrix formulas (here $Y$ is a random column vector and $A$ is a known constant matrix): $$ \text{E}[AY] = A\, \text{E}[Y] $$ $$ \text{Var}(AY) = A\, \text{Var}(Y)\, A^{T}. $$ The first one gives your result 1). For the second one, calculate $$ \text{Var}\!\left(U^T (X-m)\right) = U^T \text{Var}(X-m)\, U = U^T \Sigma U = U^T U D U^T U = D $$ (since $U U^T = U^T U = I$).
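This calculation is easy to check numerically. A minimal sketch with NumPy, using an arbitrary symmetric positive semi-definite matrix as a stand-in for $\Sigma$ (the specific matrix is a made-up example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary symmetric positive semi-definite Sigma (hypothetical data).
A = rng.standard_normal((3, 3))
Sigma = A @ A.T

# Spectral decomposition Sigma = U D U^T; eigh is for symmetric matrices.
eigvals, U = np.linalg.eigh(Sigma)
D = np.diag(eigvals)

# Var(U^T (X - m)) = U^T Var(X) U = U^T Sigma U, which should equal D.
var_Y = U.T @ Sigma @ U
print(np.allclose(var_Y, D))  # True: the covariance matrix of Y is diagonal
```

The off-diagonal entries of `var_Y` are exactly the covariances $Cov(y_i, y_j)$, and they vanish up to floating-point error.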
Your third result now follows immediately, since the variance-covariance matrix of $Y$ was just shown to be diagonal; its diagonal entries $\lambda_i$ also give your result 2).
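The double sum in the question can also be simplified entrywise (a sketch, using $\Sigma_{lk} = E[(x_l-m_l)(x_k-m_k)]$, the eigenvector relation $\Sigma v_j = \lambda_j v_j$, and the orthonormality $v_i^T v_j = \delta_{ij}$):

$$ Cov(y_i,y_j) = \sum_{l=1}^p \sum_{k=1}^p v_{li}\, v_{kj}\, \Sigma_{lk} = v_i^T \Sigma v_j = \lambda_j\, v_i^T v_j = \lambda_j \delta_{ij}, $$

which gives results 2) and 3) at once.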

The last one follows if, as is usual, you assume that the eigenvalues on the diagonal of $D$ are given in descending order. That is a convention, not a result: in the product $U D U^T$, if you permute the columns of $U$ and the diagonal entries of $D$ in the same way, the product is unchanged.
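As a concrete illustration of this convention (a sketch; note that NumPy's `eigh` happens to return eigenvalues in *ascending* order, so the reversal below imposes the descending convention):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
Sigma = B @ B.T  # arbitrary symmetric positive semi-definite example

# np.linalg.eigh returns eigenvalues in ascending order; reversing both the
# eigenvalues and the matching columns of U adopts the descending convention,
# so Var(y_1) >= Var(y_2) >= ... >= Var(y_p) holds by construction.
eigvals, U = np.linalg.eigh(Sigma)
eigvals, U = eigvals[::-1], U[:, ::-1]

print(np.allclose(U @ np.diag(eigvals) @ U.T, Sigma))  # True: product unchanged
print(np.all(np.diff(eigvals) <= 0))                   # True: descending order
```

Permuting the eigenpairs consistently leaves $U D U^T$ unchanged, which is exactly why the ordering is a convention rather than a theorem.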