Is it possible for an eigenspace to have more than one vector in its basis? What would that imply? Would every vector in the basis be an eigenvector?


So far I've seen that the vector that makes up a basis for the null space of the matrix $A - xI$ (where $x$ is an eigenvalue) is an eigenvector corresponding to the eigenvalue $x$. But I've never run into a situation where that basis had more than one vector in it.

Let's say the basis of the null space of $A - 2I$ was made up of two vectors. Would that mean those two (and only those two) vectors are the eigenvectors corresponding to the eigenvalue $2$ for the matrix $A$?
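
This situation is easy to produce numerically. Here is a sketch in NumPy with a made-up matrix $A = \mathrm{diag}(2, 2, 5)$, chosen (my own illustrative example, not from the question) so that the eigenvalue $2$ repeats:

```python
import numpy as np

# Hypothetical example: a 3x3 matrix whose eigenvalue 2 is repeated,
# so the null space of A - 2I is two-dimensional.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
# eigenvalues contains 2 twice and 5 once

# By rank-nullity: dim(null(A - 2I)) = 3 - rank(A - 2I)
nullity = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))
print(nullity)  # 2 -> a basis of that eigenspace has two vectors
```

So the null space of $A - 2I$ here needs two basis vectors, exactly the scenario asked about.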

One thing of note: from what I understand, the identity transformation (on any space) has only one eigenvalue, $1$, but infinitely many eigenvectors (not sure if there is a vector space where this wouldn't be true, but it holds for $\mathbb{R}^n$ at least). Not sure how this fact fits in here.
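
That observation can also be checked directly; a small sketch (with $n = 4$ as an arbitrary choice):

```python
import numpy as np

n = 4
I = np.eye(n)

# Every nonzero vector is an eigenvector of I with eigenvalue 1:
v = np.array([1.0, -2.0, 3.0, 0.5])  # arbitrary nonzero vector
assert np.allclose(I @ v, 1.0 * v)

# The eigenspace for eigenvalue 1 is all of R^n:
# the nullity of I - 1*I (the zero matrix) is n.
dim = n - np.linalg.matrix_rank(I - 1 * np.eye(n))
print(dim)  # 4
```

Here the eigenspace for $1$ is the whole space, so any basis of $\mathbb{R}^n$ is a basis of eigenvectors.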

Just wanting to be sure I'm not misunderstanding something. I'm grateful for any help.


Yes, of course: you can have several vectors in the basis of an eigenspace.

First, when you have a single vector $v$ in a basis of an eigenspace of a matrix $A$, with eigenvalue $\mu$, then any nonzero multiple of this vector is also an eigenvector for $\mu$: $$ \forall c \in \mathbb{R}, \ A(cv)=\mu(cv)$$

Then, if you have two vectors $v$ and $w$ in a basis, any nonzero linear combination of these two vectors will be an eigenvector for $\mu$: $$ \left\{ \begin{array}{l} Av=\mu v\\ Aw=\mu w\\ \end{array} \right. \Rightarrow A(cv+dw) = \mu (cv+dw)$$
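
To see this implication concretely, here is a sketch with a matrix of my own choosing, $A = J + I$ ($J$ the all-ones matrix), whose eigenvalue $1$ has a two-dimensional eigenspace (the vectors whose entries sum to zero):

```python
import numpy as np

# Hypothetical example: A = J + I has eigenvalues 4 (once) and 1 (twice).
A = np.ones((3, 3)) + np.eye(3)

# Two independent eigenvectors for mu = 1 (entries sum to zero):
v = np.array([1.0, -1.0, 0.0])
w = np.array([0.0, 1.0, -1.0])
assert np.allclose(A @ v, 1.0 * v)
assert np.allclose(A @ w, 1.0 * w)

# Any nonzero combination c*v + d*w is again an eigenvector for 1:
c, d = 2.0, -0.5
u = c * v + d * w
assert np.allclose(A @ u, 1.0 * u)
```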

For example, let $A = J - I$ be the $n \times n$ matrix with all entries $1$ except $0$s on the diagonal (this example comes from graph theory: it is the adjacency matrix of the complete graph $K_n$). Then $A$ has eigenvalues $n-1$ and $-1$, and while the eigenspace associated with $n-1$ has dimension $1$ (there is only one vector in its basis), the eigenspace associated with the eigenvalue $-1$ has dimension $n-1$. Hence you can find an orthogonal basis of $n-1$ vectors for it. Any vector $v$ satisfying $Av=-v$ can be written as a linear combination of the vectors in this basis.
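
A numerical check of this example (taking $n = 4$ as an arbitrary choice):

```python
import numpy as np

n = 4
A = np.ones((n, n)) - np.eye(n)  # J - I: 1s everywhere, 0s on the diagonal

# Eigenvalues should be n-1 = 3 (once) and -1 (with multiplicity n-1 = 3):
vals = np.sort(np.linalg.eigvals(A).real)
# vals is approximately [-1, -1, -1, 3]

# Dimension of the eigenspace for -1, by rank-nullity:
dim = n - np.linalg.matrix_rank(A - (-1) * np.eye(n))
print(dim)  # 3, i.e. n - 1
```

Note that $A - (-1)I = J$ has rank $1$, which is why the nullity comes out to $n - 1$.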


You are understanding things just right.

For example, the $3 \times 3$ matrix $$ \begin{bmatrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 3 \end{bmatrix} $$ has two eigenvalues, $2$ and $3$.

Every nonzero vector on the $x$-axis is an eigenvector for eigenvalue $2$, and any one of them forms a basis for that (one-dimensional) eigenspace.

Every nonzero vector in the $y$-$z$ plane is an eigenvector for eigenvalue $3$. Any two linearly independent vectors in that plane form a basis of eigenvectors for that eigenvalue.
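
A quick verification of both eigenspaces of this matrix (the choice of the two plane vectors below is mine; any independent pair would do):

```python
import numpy as np

A = np.diag([2.0, 3.0, 3.0])

# Eigenspace for 2 (the x-axis) has dimension 1:
print(3 - np.linalg.matrix_rank(A - 2 * np.eye(3)))  # 1

# Eigenspace for 3 (the y-z plane) has dimension 2:
print(3 - np.linalg.matrix_rank(A - 3 * np.eye(3)))  # 2

# Two linearly independent vectors in the y-z plane are both eigenvectors:
w1 = np.array([0.0, 1.0, 1.0])
w2 = np.array([0.0, 1.0, -1.0])
assert np.allclose(A @ w1, 3.0 * w1)
assert np.allclose(A @ w2, 3.0 * w2)
```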