Let $T$ be a linear operator on a finite-dimensional vector space $V$, and suppose that the distinct eigenvalues of $T$ are $λ_1, λ_2, \ldots, λ_k.$ Prove that
$\operatorname{span}(\{x \in V : x \text{ is an eigenvector of } T\}) = E_{\lambda_1} \oplus E_{\lambda_2} \oplus \cdots \oplus E_{\lambda_k}.$
For each $i$ with $1 \le i \le k$, let $\{x_{i1}, x_{i2}, \ldots, x_{in_i}\}$ be a basis of eigenvectors of $E_{\lambda_i}$. Since every eigenvector of $T$ lies in some $E_{\lambda_i}$,
$\operatorname{span}(\{x \in V : x \text{ is an eigenvector of } T\}) = \left\{\sum_{i=1}^{k}\sum_{p=1}^{n_i} a_{ip} x_{ip}\right\} = E_{\lambda_1} + E_{\lambda_2} + \cdots + E_{\lambda_k}.$
Now we need to prove that $E_{\lambda_j} \cap \sum_{i \neq j} E_{\lambda_i} = \{0\}$ for each $j$. Suppose $x \neq 0$ and $x \in E_{\lambda_j} \cap \sum_{i \neq j} E_{\lambda_i}$, so $x \in E_{\lambda_j}$ and $x \in \sum_{i \neq j} E_{\lambda_i}$. Then $x = a_{j1}x_{j1} + a_{j2}x_{j2} + \cdots + a_{jn_j}x_{jn_j}$ and $x = \sum_{i=1, i\neq j}^{k} \sum_{p=1}^{n_i} a_{ip} x_{ip}$, which gives $a_{j1}x_{j1} + a_{j2}x_{j2} + \cdots + a_{jn_j}x_{jn_j} - \sum_{i=1, i\neq j}^{k} \sum_{p=1}^{n_i} a_{ip} x_{ip} = 0.$ But we don't know that the $x_{ip}$, taken all together, are linearly independent. How do I complete the proof? Please help me.
Here is a way that I found in a textbook that I like. Another way to show what we want is as follows: suppose that $v_1,\dots,v_k$ are eigenvectors with distinct eigenvalues $\lambda_1,\dots,\lambda_k$ (i.e. $v_1\in E_{\lambda_1},\dots,v_k\in E_{\lambda_k}$). We seek to show that $\sum a_jv_j=0 \implies a_1=\cdots=a_k=0$. So suppose that $\sum a_jv_j=0$. Now, for each $i$ consider the operator $g_i(T) = \displaystyle\prod_{j\neq i} \frac{T-\lambda_j I}{\lambda_i-\lambda_j}$, i.e. the product of all factors $\frac{T-\lambda_j I}{\lambda_i-\lambda_j}$ with $j \neq i$ (the $j=i$ factor is omitted). For $j \neq i$, the factor indexed by $j$ annihilates $v_j$, since $(T-\lambda_j I)v_j = 0$; on the other hand, every factor fixes $v_i$, since $(T-\lambda_j I)v_i = (\lambda_i-\lambda_j)v_i$. Hence $g_i(T)v_j = 0$ for $j \neq i$ and $g_i(T)v_i = v_i$. Then we get $0 = g_i(T)\big(\sum_j a_jv_j\big) = \sum_j a_j\,g_i(T)(v_j) = a_iv_i$, and since $v_i \neq 0$ this implies $a_i = 0$. So eigenvectors drawn from distinct eigenspaces are linearly independent; in particular, no nonzero vector of $E_{\lambda_j}$ can lie in $\sum_{i\neq j} E_{\lambda_i}$, which is exactly the condition that makes the sum direct.
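If it helps to see the operator trick in action, here is a minimal numerical sketch. The matrix $T$, its eigenvalues, and the eigenvectors below are my own made-up example, not from the proof above; the code just checks that $g_i(T)$ fixes $E_{\lambda_i}$ and annihilates the other eigenspaces.

```python
import numpy as np

# Hypothetical diagonalizable matrix with distinct eigenvalues 2 and 5.
T = np.array([[2.0, 1.0],
              [0.0, 5.0]])
lams = np.array([2.0, 5.0])

def g(i, T, lams):
    """Build g_i(T) = prod_{j != i} (T - lam_j I) / (lam_i - lam_j)."""
    n = T.shape[0]
    P = np.eye(n)
    for j, lam in enumerate(lams):
        if j != i:
            P = P @ (T - lam * np.eye(n)) / (lams[i] - lam)
    return P

v1 = np.array([1.0, 0.0])   # eigenvector for lambda = 2
v2 = np.array([1.0, 3.0])   # eigenvector for lambda = 5

P0 = g(0, T, lams)
print(np.allclose(P0 @ v1, v1))   # g_0(T) fixes v1       -> True
print(np.allclose(P0 @ v2, 0.0))  # g_0(T) annihilates v2 -> True
```

So applying $g_0(T)$ to a relation $a_1 v_1 + a_2 v_2 = 0$ leaves only $a_1 v_1$, exactly as in the proof.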