Diagonal elements of correlation matrix are $1$


I am not well versed in statistics, so I am asking a fundamental question:

  1. $X\in \mathbf{R}^n$: a random vector.
  2. $K_X$: the covariance matrix of $X$
  3. $R_X$: the correlation matrix of $X$
  4. $m_X = \operatorname{E}[X]$

We know that

  1. $K_X = \operatorname{E}[(X - \operatorname{E}[X])(X - \operatorname{E}[X])^T]$
  2. $K_X=R_X-m_Xm_X^T$.
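The identity $K_X = R_X - m_X m_X^T$ (where $R_X = \operatorname{E}[XX^T]$) can be checked numerically. The sketch below is illustrative, not from the original post; it estimates all three matrices from random samples, so the identity holds exactly for the *sample* moments and up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(0)
# Draw many samples of a 3-dimensional random vector X.
X = rng.normal(loc=[1.0, -2.0, 0.5], scale=[1.0, 2.0, 0.3], size=(10_000, 3))

m = X.mean(axis=0)                       # sample estimate of m_X = E[X]
R = (X.T @ X) / len(X)                   # sample second-moment matrix E[X X^T]
K = np.cov(X, rowvar=False, bias=True)   # sample covariance matrix K_X (1/N normalization)

# K_X = R_X - m_X m_X^T holds exactly for the sample moments.
print(np.allclose(K, R - np.outer(m, m)))  # True
```

Note that the diagonal of this $R_X$ consists of the second moments $\operatorname{E}[X_i^2]$, which are in general not equal to $1$.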


From the Wiki:

https://en.wikipedia.org/wiki/Covariance_matrix#Correlation_matrix

Elements of the diagonal of a correlation matrix are $1$.

How to explain it from the formula $K_X=R_X-m_Xm_X^T$?

I know the diagonal elements of $R_X$ in this formula are $\operatorname{E}[X_i^2]$ for $i=1,\ldots, n$; how can one prove they are $1$?

Best Answer

Take as a starting point the definition from Wikipedia: the correlation matrix is $(\mathrm{diag}(\Sigma))^{-1/2} \Sigma (\mathrm{diag}(\Sigma))^{-1/2}$, where $\Sigma_{ij} = \mathrm{Cov}(X_i, X_j)$. From this point of view there is nothing special about covariances that causes the diagonal elements to be one.

Indeed, if $S = (s_{ij})$ is any matrix whose diagonal entries are positive, then the diagonal elements of $(\mathrm{diag}(S))^{-1/2} S (\mathrm{diag}(S))^{-1/2}$ are all $1$. To see this, note that multiplying $S$ from the right by $(\mathrm{diag}(S))^{-1/2}$ multiplies every element of the $j$'th column of $S$ by $s_{jj}^{-1/2}$; similarly, multiplying from the left multiplies every element of the $i$'th row by $s_{ii}^{-1/2}$. The $i$'th diagonal element $s_{ii}$ is therefore multiplied by $s_{ii}^{-1/2}$ twice, so in the product of interest the $i$'th diagonal element is $s_{ii}^{-1/2} s_{ii}^{-1/2} s_{ii} = 1$.
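This normalization argument is easy to verify numerically. The following sketch (my own illustration, not part of the original answer) applies it to an arbitrary matrix with a positive diagonal, not just a covariance matrix, which demonstrates that the unit diagonal comes purely from the scaling:

```python
import numpy as np

rng = np.random.default_rng(1)

# Any matrix S with positive diagonal entries works; S need not be a covariance.
S = rng.normal(size=(4, 4))
np.fill_diagonal(S, np.abs(np.diagonal(S)) + 1.0)  # force a positive diagonal

# D = (diag(S))^{-1/2}; left- and right-multiplying by D scales row i and
# column i by s_ii^{-1/2}, so the (i, i) entry becomes s_ii / s_ii = 1.
D = np.diag(1.0 / np.sqrt(np.diagonal(S)))
normalized = D @ S @ D

print(np.allclose(np.diagonal(normalized), 1.0))  # True
```

The off-diagonal entries, by contrast, become $s_{ij}/\sqrt{s_{ii}\,s_{jj}}$, which for a covariance matrix is exactly the correlation coefficient $\mathrm{Corr}(X_i, X_j)$.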