Eigenvalues and eigenvectors in a special matrix


Let $\Sigma$ be the covariance matrix for some correlation coefficient $\rho>0$. $$\Sigma=\begin{bmatrix}1 & \rho &\rho&\rho&\rho\\\rho&1&\rho&\rho&\rho\\\rho&\rho&1&\rho&\rho\\\rho&\rho&\rho&1&\rho\\\rho&\rho&\rho&\rho&1\end{bmatrix}$$ Find the eigenvalues and eigenvectors of this covariance matrix.

I'm looking for some trick to find the eigenvalues and eigenvectors, because solving $$\det(\Sigma-\lambda I)=0$$ directly would be very complicated.

Any property that I can use in this case?



BEST ANSWER

You can recognize inside it a familiar matrix:

$$\begin{bmatrix}\rho & \rho &\rho&\rho&\rho\\\rho&\rho&\rho&\rho&\rho\\\rho&\rho&\rho&\rho&\rho\\\rho&\rho&\rho&\rho&\rho\\\rho&\rho&\rho&\rho&\rho\end{bmatrix}=\rho J$$

where $J$ is the all-ones matrix. This matrix $J$ is an interesting one: what can be said of $J^2$? (Every entry of $J^2$ equals $n$, so $J^2=nJ$, which forces every eigenvalue of $J$ to satisfy $\lambda^2=n\lambda$, i.e. $\lambda\in\{0,n\}$.) From this remark you can find the reduction of $J$, in particular its eigenvalues and eigenvectors.
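The two facts above can be checked numerically; a minimal sketch with NumPy (the size $n=5$ matches the question):

```python
import numpy as np

n = 5
J = np.ones((n, n))  # the all-ones matrix J

# J^2 = n J, so every eigenvalue lambda of J satisfies lambda^2 = n * lambda
assert np.allclose(J @ J, n * J)

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order:
# n - 1 copies of 0, one copy of n
eigvals = np.linalg.eigvalsh(J)
print(np.round(eigvals, 10))
```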

Then, what is the relation between $\rho J$ and your initial matrix? You see that:

$$\Sigma = \rho J + (1-\rho)I_n$$

and since $(1-\rho)I_n$ is a scalar matrix, it looks the same in every basis, so finding a basis that diagonalizes $J$ is enough to give you the complete reduction of $\Sigma$.
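This last step can also be verified numerically: any orthonormal eigenbasis of $J$ diagonalizes $\Sigma$ as well. A sketch with NumPy, where the value of $\rho$ is chosen arbitrarily for illustration:

```python
import numpy as np

n, rho = 5, 0.3  # rho > 0 chosen arbitrarily for illustration
J = np.ones((n, n))
Sigma = rho * J + (1 - rho) * np.eye(n)

# Columns of Q form an orthonormal eigenbasis of J ...
_, Q = np.linalg.eigh(J)

# ... and this same basis diagonalizes Sigma, because the scalar part
# (1 - rho) I is unchanged under any orthogonal change of basis
D = Q.T @ Sigma @ Q
assert np.allclose(D, np.diag(np.diag(D)))
print(np.round(np.diag(D), 10))
```

The diagonal of `D` contains $1-\rho$ with multiplicity $n-1$ and $1+(n-1)\rho$ with multiplicity $1$.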


Clearly, $\Sigma=\rho J_n+(1-\rho)I_n$. The eigenvalues of $J_n$ are $n$ (with multiplicity $1$) and $0$ (with multiplicity $n-1$). The eigenvector of $J_n$ corresponding to $n$ is $\mathbf{1}_n$, the vector of all ones. The remaining eigenvectors of $J_n$ are all orthogonal to $\mathbf{1}_n$, so one can choose them as the vectors $\varepsilon_i$, $i=2, \ldots, n$, where $\varepsilon_i(j)=\begin{cases}1, &\text{ if }j=1,\\-1, & \text{ if }j=i,\\0, & \text{ otherwise.}\end{cases}$

Now, since the eigenvectors of $\Sigma$ are also eigenvectors of $J_n$, the eigenvalues of $\Sigma$ are $\rho n+(1-\rho)=\rho(n-1)+1$ (with multiplicity $1$) and $1-\rho$ (with multiplicity $n-1$). For the given matrix, $n=5$, so the eigenvalues are $1+4\rho$ and $1-\rho$.
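As a quick sanity check of the closed-form eigenvalues, here is a minimal NumPy sketch (the value of $\rho$ is an arbitrary illustrative choice):

```python
import numpy as np

n, rho = 5, 0.4  # any rho > 0 works; 0.4 is an arbitrary example
Sigma = rho * np.ones((n, n)) + (1 - rho) * np.eye(n)

# Expect n - 1 copies of 1 - rho and one copy of 1 + (n - 1) rho
eigvals = np.sort(np.linalg.eigvalsh(Sigma))
expected = np.array([1 - rho] * (n - 1) + [1 + (n - 1) * rho])
assert np.allclose(eigvals, expected)
print(np.round(eigvals, 10))
```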