I am asked to prove the following: assume a linear transformation $T:V \rightarrow V$ has $k$ distinct eigenvalues $\lambda_1, \ldots, \lambda_k$; then for any eigenvalue $\lambda_i$, $E_{\lambda_i} \cap (\sum_{j\ne i} E_{\lambda_j}) = \{\mathbf{0}\}$, where $\sum_{j\ne i} E_{\lambda_j}$ denotes the sum of the eigenspaces of all the distinct eigenvalues other than $\lambda_i$.
My current thought: assume $\mathbf{v} \in E_{\lambda_i} \cap \sum_{j\ne i} E_{\lambda_j}$. Then $T(\mathbf{v}) = \lambda_i\mathbf{v}$, and also, writing $\mathbf{v} = \sum_{j\ne i} \mathbf{v}_j$ where each $\mathbf{v}_j$ is some eigenvector of $\lambda_j$, $T(\mathbf{v}) = \sum_{j \ne i} \lambda_j\mathbf{v}_j$. From these I get $\mathbf{0} = -\lambda_i\mathbf{v}+\sum_{j \ne i} \lambda_j\mathbf{v}_j$. However, I can't seem to deduce from there that $\mathbf{v}=\mathbf{0}$ must hold. I think I should use the fact that eigenvectors of distinct eigenvalues are linearly independent, but I cannot see how to apply it.
Suppose that
$E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j} \ne \{ \mathbf 0 \}; \tag 1$
then, as pointed out by our OP PsychoCom, there is a vector
$\mathbf 0 \ne \mathbf v \in E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j}; \tag 2$
we thus have
$T\mathbf v = \lambda_i \mathbf v; \tag 3$
that is, $\mathbf v$ is an eigenvector corresponding to $\lambda_i$; and, since
$\mathbf v \in \displaystyle \sum_{j \ne i} E_{\lambda_j}, \tag 4$
we have
$\mathbf v = \displaystyle \sum_{j \ne i} \mathbf v_j, \tag 5$
where
$T\mathbf v_j = \lambda_j \mathbf v_j, \; \mathbf v_j \in E_{\lambda_j}; \tag 6$
discarding from (5) any terms with $\mathbf v_j = \mathbf 0$ (at least one nonzero term must remain, since $\mathbf v \ne \mathbf 0$), we may assume each remaining $\mathbf v_j$ is an eigenvector corresponding to $\lambda_j$.
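Indeed, rearranging (5) yields a single linear relation among eigenvectors corresponding to the distinct eigenvalues $\lambda_i$ and $\lambda_j$, $j \ne i$:
$\mathbf v - \displaystyle \sum_{j \ne i} \mathbf v_j = \mathbf 0, \tag 7$
a relation in which the coefficient of $\mathbf v$ is $1 \ne 0$, so it is nontrivial.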
Now, it is a basic and well-known theorem that eigenvectors corresponding to distinct eigenvalues are linearly independent; thus no nontrivial relation such as (7) can hold. This contradiction shows that, in fact,
$E_{\lambda_i} \cap \displaystyle \sum_{j \ne i} E_{\lambda_j} = \{ \mathbf 0 \}. \tag 8$
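As a quick sanity check (a small example of my own, not part of the proof): take $T = \operatorname{diag}(1, 2, 3)$ acting on $\mathbb R^3$, so that $\lambda_1 = 1$, $\lambda_2 = 2$, $\lambda_3 = 3$ with eigenspaces spanned by the standard basis vectors; then, with $i = 1$,
$E_{\lambda_1} \cap (E_{\lambda_2} + E_{\lambda_3}) = \operatorname{span}\{\mathbf e_1\} \cap \operatorname{span}\{\mathbf e_2, \mathbf e_3\} = \{\mathbf 0\},$
exactly as (8) asserts.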