Any vector is a linear combination of the eigenvectors? [Strang, p. 296, 6.1.25]


Suppose $A$ and $B$ have the same eigenvalues $\lambda_1, \cdots, \lambda_n $ with the same independent eigenvectors $\mathbf{x_1, \cdots, x_n}$. Then $A = B$. Reason: Any vector $\mathbf{x}$ is a combination $\sum_{1 \le j \le n} c_j\mathbf{x_j}$. What are $A\mathbf{x}$ and $B\mathbf{x}$?

Solution (cf. p. 4 of 6 of this PDF):


$A\mathbf{x} = A\sum\limits_{1 \le j \le n} c_j\mathbf{x_j} = \sum\limits_{1 \le j \le n} c_j A\mathbf{x_j} = \sum\limits_{1 \le j \le n} c_j\lambda_j\mathbf{x_j}$.


Similarly, $B\mathbf{x} = \sum\limits_{1 \le j \le n} c_j\lambda_j\mathbf{x_j}$. Then conclude by appealing to the standard fact: if $A\mathbf{x} = B\mathbf{x}$ for all $\mathbf{x} \in \mathbb{C}^n$, then $A = B$.
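The solution's computation can be checked numerically. The sketch below (all matrices and vectors are illustrative choices, not from the problem) builds a matrix $A$ with prescribed independent eigenvectors (the columns of $V$) and eigenvalues $\lambda_j$, expands an arbitrary $\mathbf{x}$ in the eigenbasis, and verifies $A\mathbf{x} = \sum_j c_j\lambda_j\mathbf{x_j}$:

```python
import numpy as np

# Hypothetical 3x3 example: independent eigenvectors as the columns of V,
# eigenvalues in lam. Then A = V diag(lam) V^{-1} has eigenpairs (lam_j, V[:, j]).
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
lam = np.array([2.0, 3.0, 5.0])
A = V @ np.diag(lam) @ np.linalg.inv(V)

x = np.array([1.0, -2.0, 4.0])   # an arbitrary vector
c = np.linalg.solve(V, x)        # coefficients: x = sum_j c_j x_j

# A x should equal sum_j c_j lam_j x_j, as the solution claims
lhs = A @ x
rhs = V @ (c * lam)              # sum_j c_j lam_j V[:, j]
assert np.allclose(lhs, rhs)
```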

  1. The question wrote $\mathbf{x} = \sum\limits_{1 \le j \le n} c_j\mathbf{x_j}$. What is $\mathbf{x}$? Any vector in $\mathbb{C}^n$?
    If so, how can every possible vector be written as a linear combination of the eigenvectors?

  2. What's the intuition behind the above result: two matrices are the same when they have the same eigenvalues and eigenvectors?


There are 3 best solutions below

  1. Yes, $\mathbf{x}$ can be any vector in $\mathbb{C}^n$. If the $\mathbf{x_j}$ are independent eigenvectors, they form a basis for the space (because there are $n$ of them), so every vector can be written as a unique linear combination of the $\mathbf{x_j}$.

  2. The result says that if the linear transformations $A$ and $B$ have the same eigenvalues and corresponding eigenvectors, and the eigenvectors are independent, then $A$ and $B$ are the same.


I assume you're working over $K^n$. The problem implicitly assumes that $A$ and $B$ are diagonalizable and that, for each eigenvalue $\lambda_i$, we have $\ker(A-\lambda_i 1)=\ker(B-\lambda_i 1)$. Under these hypotheses the claim follows: since $A$ and $B$ are both diagonalizable, the eigenspaces $\ker(A-\lambda_i 1)$ sum to the whole space, and since they coincide, we may exhibit a basis $v_1,\ldots,v_n$ of $K^n$ consisting of eigenvectors. Then any $x$ can be written uniquely as a combination $\sum x_iv_i$ of basis elements, and $$Ax=\sum_{i=1}^n x_i Av_i=\sum_{i=1}^n x_i\lambda_iv_i=\sum_{i=1}^n x_i Bv_i=Bx.$$
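One way to see this answer's point concretely: a diagonalizable matrix is completely determined by its eigendata. The sketch below (the matrix is an arbitrary example) diagonalizes $A$, rebuilds $B = V\,\mathrm{diag}(\lambda)\,V^{-1}$ from the eigenpairs alone, and recovers $A$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # diagonalizable: distinct eigenvalues 5 and 2

lam, V = np.linalg.eig(A)         # eigenvalues lam, eigenvectors as columns of V
B = V @ np.diag(lam) @ np.linalg.inv(V)   # rebuilt from eigendata only

# Same eigenvalues with the same independent eigenvectors => same matrix
assert np.allclose(A, B)
```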


This result seems odd unless we add that $n$ is the dimension of the vector space, in which case the matrices are diagonalizable; otherwise a counterexample is easy to find:

$$A=\begin{pmatrix} 1&1\\ 0&1 \end{pmatrix}\quad;\quad B=\begin{pmatrix} 1&2\\ 0&1 \end{pmatrix}$$ have the same eigenvalue $1$ and the same eigenvector $(1,0)^{T}$, but clearly $A\ne B$.
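The counterexample is easy to verify numerically: both matrices have $1$ as their only eigenvalue, both fix $e_1 = (1,0)^T$, yet they differ. The failure is that neither matrix has $2$ independent eigenvectors, so the hypothesis of the exercise does not hold.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
e1 = np.array([1.0, 0.0])

# Both are triangular with 1s on the diagonal: only eigenvalue is 1 (twice)
assert np.allclose(np.linalg.eigvals(A), [1.0, 1.0])
assert np.allclose(np.linalg.eigvals(B), [1.0, 1.0])

# e1 is a shared eigenvector: A e1 = B e1 = 1 * e1
assert np.allclose(A @ e1, e1) and np.allclose(B @ e1, e1)

# ...yet the matrices differ: the "n independent eigenvectors" hypothesis fails
assert not np.allclose(A, B)
```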