If $v_1,...,v_r$ are eigenvectors that correspond to distinct eigenvalues, then they are linearly independent.


Prove:

If $v_1,...,v_r$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1, ...,\lambda_r$ of an $n \times n$ matrix $A$, then the set $\{v_1,...,v_r\}$ is linearly independent.

Please give an example and tell me how this theorem works!


There are 3 answers below.

---

Recall that $\;\{v_1,...,v_r\}\;$ is linearly dependent iff there is some $\;1\le i\le r\;$ such that $\;v_i\;$ is a linear combination of $\;v_1,...,v_{i-1}\;$ (this works because eigenvectors are nonzero, so $v_1$ by itself is independent). Let $\;i\;$ be the first index for which this happens:

$$(1)\;\;v_i=\sum_{k=1}^{i-1}a_kv_k\implies\lambda_iv_i=\sum_{k=1}^{i-1}a_k\lambda_iv_k$$

$$(2)\;\;\lambda_iv_i=Av_i=\sum_{k=1}^{i-1}a_kAv_k=\sum_{k=1}^{i-1}a_k\lambda_kv_k$$

Now subtract (2) from (1) and finish the argument.
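For readers who want the ending, here is one way the subtraction finishes (a sketch):

```latex
% Subtracting (2) from (1):
\[
0 \;=\; \sum_{k=1}^{i-1} a_k(\lambda_i - \lambda_k)\, v_k .
\]
% By the minimality of i, the vectors v_1, ..., v_{i-1} are linearly
% independent, so each coefficient a_k (lambda_i - lambda_k) must vanish.
% The eigenvalues are distinct, so lambda_i - lambda_k != 0, hence every
% a_k = 0.  But then (1) gives v_i = 0, contradicting the fact that
% eigenvectors are nonzero.
```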

---

Here is a slightly different proof from the usual one, for the fields $\mathbb{R}$ or $\mathbb{C}$:

Suppose we order the eigenvalues so that $|\lambda_1| > |\lambda_2| > \cdots > |\lambda_r|$. (This argument assumes the eigenvalues have distinct absolute values, which is stronger than just being distinct.)

Suppose $v=\sum_k \alpha_k v_k = 0$, where the $v_k$ correspond to the $\lambda_k$. Then $({1 \over \lambda_1} A)^m v = \sum_k \alpha_k ({\lambda_k \over \lambda_1})^m v_k = 0$ for every $m$. Since $|\lambda_k / \lambda_1| < 1$ for $k \ge 2$, letting $m \to \infty$ gives $\lim_m ({1 \over \lambda_1} A)^m v = \alpha_1 v_1 = 0$, and since $v_1 \neq 0$ this shows that $\alpha_1 = 0$.

Now repeat the process with $({1 \over \lambda_2} A)^m v$, etc.
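The limiting argument above can be checked numerically. A toy example of my own choosing, using only plain Python: for $A=\operatorname{diag}(3,2,1)$ the eigenvalues have distinct absolute values, and repeatedly applying $\frac{1}{\lambda_1}A$ leaves the $v_1$ component fixed while the others shrink like $(\lambda_k/\lambda_1)^m \to 0$.

```python
# Numeric illustration (toy example): ((1/lambda_1) A)^m v -> alpha_1 v_1
# when |lambda_1| is strictly the largest absolute value.

def mat_vec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[3, 0, 0],
     [0, 2, 0],
     [0, 0, 1]]
lam1 = 3

# v = 5*e1 - 2*e2 + 7*e3, so alpha_1 = 5.
w = [5.0, -2.0, 7.0]

# Apply (1/lambda_1) * A repeatedly.
for _ in range(200):
    w = [x / lam1 for x in mat_vec(A, w)]

# The e_1 component is untouched; the others decay like (2/3)^m and (1/3)^m.
print(w[0], max(abs(w[1]), abs(w[2])) < 1e-12)  # 5.0 True
```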

I'm not sure what you mean by an example, but you could take $A=\operatorname{diag}(1,2,...,n)$; its eigenvalues $1,\dots,n$ are distinct, the corresponding eigenvectors are $e_1,...,e_n$, and these are obviously linearly independent.
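That example can be verified directly in plain Python (for $n=3$, my own choice of size): each $e_i$ satisfies $Ae_i = \lambda_i e_i$, and the matrix whose columns are the eigenvectors has nonzero determinant, i.e. the set is linearly independent.

```python
# Check the diagonal example A = diag(1, 2, 3): the standard basis vectors
# are eigenvectors for distinct eigenvalues, and they are independent.

def mat_vec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def det3(M):
    """Determinant of a 3x3 matrix via cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 3]]
eigvals = [1, 2, 3]
eigvecs = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # e_1, e_2, e_3

# Each e_i really is an eigenvector: A e_i = lambda_i e_i.
for lam, v in zip(eigvals, eigvecs):
    assert mat_vec(A, v) == [lam * x for x in v]

# The matrix with columns e_1, e_2, e_3 is the identity; determinant 1 != 0,
# so {e_1, e_2, e_3} is linearly independent.
V = [[eigvecs[j][i] for j in range(3)] for i in range(3)]
print(det3(V))  # 1
```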

---

Another way to look at it:

You want to show that $c_1v_1+c_2v_2+\dots+c_rv_r=0$ forces every $c_i$ to be $0$. Since each $v_i$ is an eigenvector, $Av_i = \lambda_iv_i$, i.e. $(A-\lambda_iI)v_i=0$. Multiply both sides of $\sum^r_{i=1}c_iv_i=0$ by $(A-\lambda_1I)(A-\lambda_2I)\cdots(A-\lambda_{i-1}I)(A-\lambda_{i+1}I)\cdots(A-\lambda_rI)$: each factor $(A-\lambda_jI)$ annihilates the $v_j$ term and multiplies the $v_i$ term by $\lambda_i-\lambda_j$, leaving $c_i\prod_{j\neq i}(\lambda_i-\lambda_j)v_i=0$. The eigenvalues are distinct and $v_i\neq 0$, so $c_i=0$. Repeat the process for each $c_i$.
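The mechanism can be watched in action on a small matrix of my own choosing, in plain Python: applying $\prod_{j\neq i}(A-\lambda_jI)$ to a combination $\sum_k c_kv_k$ kills every term except $c_i\prod_{j\neq i}(\lambda_i-\lambda_j)\,v_i$, so if the combination were the zero vector, $c_i$ would have to be $0$.

```python
# Toy check of the annihilator trick: the product over j != i of
# (A - lambda_j I) isolates the v_i component of a linear combination.

def mat_vec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 3]]
eigvals = [1, 2, 3]

c = [5, -2, 7]       # arbitrary coefficients
w = [5, -2, 7]       # c_1 e_1 + c_2 e_2 + c_3 e_3

i = 1                # isolate c_2 (0-based index 1)
for j, lam in enumerate(eigvals):
    if j != i:
        Aw = mat_vec(A, w)                       # apply (A - lambda_j I)
        w = [a - lam * x for a, x in zip(Aw, w)]

# Expected: c_2 * (lambda_2 - lambda_1)(lambda_2 - lambda_3) * e_2
#         = -2 * (2 - 1) * (2 - 3) * e_2 = 2 * e_2
print(w)  # [0, 2, 0]
```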