Can two eigenvectors share one eigenvalue?


I apologize in advance for the mess. This is my attempt.
The matrix in question is

\begin{align} \textbf{A} = \frac{1}{100}\cdot\begin{pmatrix} 92 & 0 & -144 \\ 0 & 100 & 0 \\ -144 & 0 & 8 \end{pmatrix}.\end{align} We have to find the normalised eigenvector $v = \begin{pmatrix} v_{1} \\ v_{2} \\ v_{3} \end{pmatrix}$ for the eigenvalue $\lambda = -1$, such that $\lvert v \rvert = 1 $.

I solved the equation $$(\textbf{A} - \lambda \textbf{I})\cdot v = 0$$ i.e. $$\begin{bmatrix} 192 & 0 & -144 \\ 0 & 200 & 0 \\ -144 & 0 & 108 \end{bmatrix}\cdot\begin{bmatrix}v_{1} \\ v_{2} \\ v_{3}\end{bmatrix} = 0$$ I used Gaussian elimination and found that $$v = \begin{bmatrix}\frac{3}{4}v_{3} \\ 0 \\ v_{3} \end{bmatrix}.$$ Using the fact that $\lvert v \rvert = 1$, I found that $$ 1 = \Big(\frac{3}{4}v_{3}\Big)^{2} + \Big(v_{3}\Big)^{2} \implies v_{3} = \pm \frac{4}{5}.$$ I tried plugging both $\begin{bmatrix}\frac{3}{5} \\ 0 \\ \frac{4}{5} \end{bmatrix}$ and $\begin{bmatrix} -\frac{3}{5} \\ 0 \\ -\frac{4}{5} \end{bmatrix}$ in the original equation and they both work. So I'm wondering if it's possible that matrix $\textbf{A}$ has two eigenvectors for the eigenvalue $\lambda = -1$, or is the sign irrelevant? If so, which one should I consider?
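(For anyone who wants to sanity-check this computation: it can be verified numerically, e.g. with NumPy; both sign choices found above are tested here.)

```python
import numpy as np

# The matrix from the question.
A = np.array([[92, 0, -144],
              [0, 100, 0],
              [-144, 0, 8]]) / 100

# Both normalised candidates found above.
for v in (np.array([3/5, 0, 4/5]), np.array([-3/5, 0, -4/5])):
    assert np.isclose(np.linalg.norm(v), 1)   # unit length
    assert np.allclose(A @ v, -v)             # A v = -v, so both signs work
```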
Any help or suggestions are appreciated.

3 Answers

BEST ANSWER

Yes, there can be more than one linearly independent eigenvector associated with a given eigenvalue; the number of linearly independent eigenvectors belonging to an eigenvalue $\lambda$ is called its geometric multiplicity $m_g(\lambda)$.

However, for them to count as two (or more) different eigenvectors, they need to be linearly independent. In your case, one is $-1$ times the other, so they are linearly dependent and therefore the 'same' eigenvector in this sense: up to scaling, there is only one eigenvector for the eigenvalue $\lambda = -1$.

In general, the geometric multiplicity $m_g(\lambda)$ of an eigenvalue is the dimension of the eigenspace $E_\lambda$, which is spanned by the eigenvectors $\xi_1, \xi_2, \ldots$ corresponding to $\lambda$. If you find two eigenvectors that are linearly dependent, the dimension of $E_\lambda$ does not change, so we do not count them as different.

As an example of a matrix with two linearly independent eigenvectors belonging to one eigenvalue, consider the matrix $$A = \begin{pmatrix} 2 &0\\ 0&2 \end{pmatrix}.$$ Since $A$ is diagonal (in particular triangular), its eigenvalues $\lambda$ are given by the entries on the diagonal, which are both $2$. If we calculate the eigenvectors corresponding to $2$ we get two linearly independent ones (check this!) \begin{align*} \left\{\begin{pmatrix} 1\\ 0 \end{pmatrix}, \begin{pmatrix} 0\\ 1 \end{pmatrix}\right\}. \end{align*}
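A quick numerical check of this example (a sketch using NumPy's `eig`):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
assert np.allclose(eigvals, [2.0, 2.0])   # eigenvalue 2, geometric multiplicity 2

# The eigenvectors (the columns of eigvecs) are linearly independent:
# the 2x2 matrix they form has full rank.
assert np.linalg.matrix_rank(eigvecs) == 2
```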


Great observation! Yes, it is absolutely possible for a matrix to have two eigenvectors corresponding to the same eigenvalue. We know that for each distinct eigenvalue there will be at least one eigenvector, but there could certainly be more. For example, if our matrix is the $n\times n$ identity matrix then every vector in $\mathbb{R}^n$ is an eigenvector corresponding to the eigenvalue 1.

Why does this matter? Well, one important result is that if an $n\times n$ matrix has $n$ distinct eigenvalues, then it is diagonalizable.
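The diagonalizability claim can be illustrated numerically (a sketch; the matrix below is just an arbitrary example with distinct eigenvalues):

```python
import numpy as np

# An arbitrary 2x2 example with two distinct eigenvalues (2 and 3).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, P = np.linalg.eig(A)
assert len(set(np.round(eigvals, 10))) == 2   # the eigenvalues are distinct

# Distinct eigenvalues => the eigenvectors (columns of P) are
# linearly independent => P is invertible and P^{-1} A P is diagonal.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))
```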


A real matrix has infinitely many eigenvectors for each of its eigenvalues. In fact, the set of all eigenvectors associated with an eigenvalue $\lambda$, together with the zero vector, is a vector subspace: if $v$ and $w$ are eigenvectors with respect to $\lambda$, then $A(w+v)=Aw+Av=\lambda w+ \lambda v=\lambda (w+v)$ and $A(kw)=\lambda kw$, so $w+v$ and $kw$ are also eigenvectors with respect to $\lambda$ (whenever they are nonzero).
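The closure argument can be checked concretely (a sketch; the matrix $A = 2I$ and the vectors below are arbitrary choices, not from the original question):

```python
import numpy as np

A = 2 * np.eye(2)          # every nonzero vector is an eigenvector, lambda = 2
lam = 2.0
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])
k = 3.0

# A(w + v) = Aw + Av = lambda*w + lambda*v = lambda*(w + v)
assert np.allclose(A @ (w + v), lam * (w + v))

# A(k*w) = lambda*k*w
assert np.allclose(A @ (k * w), lam * (k * w))
```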