Prove that for any eigenvalue, its eigenspace is in direct sum with the sum of all other eigenspaces of distinct eigenvalues.


I am asked to prove the following: assume a linear transformation $T:V \rightarrow V$ has $k$ distinct eigenvalues $\lambda_1, \ldots, \lambda_k$; then for any eigenvalue $\lambda_i$, $E_{\lambda_i} \cap \big(\sum_{j\ne i} E_{\lambda_j}\big) = \{\mathbf{0}\}$, where $\sum_{j\ne i} E_{\lambda_j}$ is the sum of the eigenspaces of all the eigenvalues other than $\lambda_i$.

My current thought: assume $\mathbf{v} \in E_{\lambda_i} \cap \sum_{j\ne i} E_{\lambda_j}$. Then $T(\mathbf{v}) = \lambda_i\mathbf{v}$, and also, writing $\mathbf{v} = \sum_{j\ne i} \mathbf{v}_j$ where each $\mathbf{v}_j$ is some eigenvector of $\lambda_j$, we get $T(\mathbf{v}) = \sum_{j \ne i} \lambda_j\mathbf{v}_j$. Subtracting gives $\mathbf{0} = -\lambda_i\mathbf{v}+\sum_{j \ne i} \lambda_j\mathbf{v}_j$. However, I can't seem to deduce from there that $\mathbf{v}=\mathbf{0}$ must hold. I think I should use the fact that eigenvectors of distinct eigenvalues are linearly independent, but I cannot see how.
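As a sanity check on the statement itself, it can be verified numerically for a concrete matrix. The sketch below uses a made-up diagonal example (note $E_2$ is 2-dimensional, so the claim is about whole eigenspaces, not single eigenvectors); the `nullspace` helper is hand-rolled via the SVD, not a library routine:

```python
import numpy as np

def nullspace(M, tol=1e-10):
    """Columns form a basis of ker(M), computed via the SVD."""
    _, s, vh = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vh[rank:].T

# Made-up example: eigenvalues 2 (multiplicity 2) and 5.
A = np.diag([2.0, 2.0, 5.0])
E2 = nullspace(A - 2.0 * np.eye(3))  # eigenspace for lambda = 2
E5 = nullspace(A - 5.0 * np.eye(3))  # eigenspace for lambda = 5

# dim(E2 ∩ E5) = dim E2 + dim E5 - dim(E2 + E5)
dim_sum = np.linalg.matrix_rank(np.hstack([E2, E5]))
dim_int = E2.shape[1] + E5.shape[1] - dim_sum
print(dim_int)  # 0: the intersection is trivial
```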


There are 3 answers below.

On BEST ANSWER

Suppose that

$E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j} \ne \{ \mathbf 0 \}; \tag 1$

then, as pointed out by our OP PsychoCom, there is a vector

$\mathbf 0 \ne \mathbf v \in E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j}; \tag 2$

we thus have

$T\mathbf v = \lambda_i \mathbf v; \tag 3$

that is, $\mathbf v$ is an eigenvector corresponding to $\lambda_i$; and, since

$\mathbf v \in \displaystyle \sum_{j \ne i} E_{\lambda_j}, \tag 4$

we have

$\mathbf v = \displaystyle \sum_{j \ne i} \mathbf v_j, \tag 5$

where

$T\mathbf v_j = \lambda_j \mathbf v_j, \; \mathbf v_j \ne \mathbf 0; \tag 6$

that is, each $\mathbf v_j$ is an eigenvector corresponding to $\lambda_j$ (any zero summands in (5) may simply be dropped, and they cannot all be zero, since $\mathbf v \ne \mathbf 0$).

Now, it is a basic and well-known theorem that eigenvectors corresponding to distinct eigenvalues are linearly independent; but (5), rewritten as $\mathbf v - \sum_{j \ne i} \mathbf v_j = \mathbf 0$, is a nontrivial linear relation among eigenvectors of the distinct eigenvalues $\lambda_i$ and $\lambda_j$, $j \ne i$, which is impossible; therefore

$E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j} = \{ \mathbf 0 \}. \tag 7$
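The independence theorem invoked above is easy to spot-check numerically. Here is a sketch on a made-up diagonalizable matrix (the matrix $P$ and the eigenvalues 2, 3, 5 are illustrative choices, not part of the answer):

```python
import numpy as np

# Build A = P D P^{-1} with three distinct eigenvalues 2, 3, 5
# (P is an arbitrary invertible matrix; all numbers are made up).
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ np.diag([2.0, 3.0, 5.0]) @ np.linalg.inv(P)

# np.linalg.eig returns one eigenvector per column.
eigvals, eigvecs = np.linalg.eig(A)

# Linear independence of the columns <=> full rank.
print(np.linalg.matrix_rank(eigvecs))  # 3
```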

On

If $\mathbf{v} \ne \mathbf{0}$ and $\mathbf{v} \in E_{\lambda_i}$, then $\mathbf{v}$ is an eigenvector of $\lambda_i$. If, in addition, $\mathbf{v} = \sum_{j \ne i} \mathbf{v}_j$ with each nonzero $\mathbf{v}_j \in E_{\lambda_j}$, then $\mathbf{v}, \mathbf{v}_j \; (j \ne i)$ are eigenvectors corresponding to distinct eigenvalues and therefore linearly independent, contradicting that very relation. This means $\mathbf{v} \not\in \sum_{j \ne i}E_{\lambda_j}$.

On

If $v\in E_{\lambda_i}\cap\sum_{j\neq i}E_{\lambda_j}$, one can always write $v=\sum_{j\in J}v_j$, where $J$ is a subset of $\{1,\ldots,k\}\setminus\{i\}$ and the $v_j$ are linearly independent eigenvectors with each $v_j\in E_{\lambda_j}$. Indeed, if you have such an expression but with some linear dependency among the $v_j$, simply write one of those $v_j$ as a linear combination of the others, and regroup vectors for the same eigenvalue into a single eigenvector for that eigenvalue (or zero); this reduces the size of $J$, and one can repeat until linear independence is obtained.

Now, as was mentioned in the question, one has $0=-\lambda_iv+\sum_{j\in J}\lambda_jv_j$, which because of $v=\sum_{j\in J}v_j$ one can rewrite as $0=\sum_{j\in J}(\lambda_j-\lambda_i)v_j$. But now the linear independence of the $v_j$ forces all the coefficients $\lambda_j-\lambda_i$ to be zero, while the eigenvalues being distinct means that none of them is zero. What gives? The only possibility is that $J=\emptyset$, so that $v=0$, as we wanted to show.
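The algebraic rewrite in the last paragraph can be double-checked numerically; a small sketch with made-up values for the $\lambda$'s and the $v_j$'s (here $J=\{1,2\}$):

```python
import numpy as np

rng = np.random.default_rng(0)
lam_i, lam_1, lam_2 = 2.0, 3.0, 5.0        # distinct eigenvalues (made up)
v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
v = v1 + v2                                # v = sum_{j in J} v_j

lhs = -lam_i * v + lam_1 * v1 + lam_2 * v2             # -lambda_i v + sum lambda_j v_j
rhs = (lam_1 - lam_i) * v1 + (lam_2 - lam_i) * v2      # sum (lambda_j - lambda_i) v_j
print(np.allclose(lhs, rhs))  # True: the two expressions agree
```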