Finding eigenvectors of a matrix


I want to find all eigenvalues and eigenvectors of the matrix $\begin{bmatrix}0&1&0\\0&0&1\\-1&0&0\end{bmatrix}$.

Here is how I find the eigenvalues: $$\begin{align*} \det(A - \lambda I) &= \det \Bigg(\begin{bmatrix}0&1&0\\0&0&1\\-1&0&0\end{bmatrix} - \begin{bmatrix}\lambda&0&0\\0&\lambda&0\\0&0&\lambda \end{bmatrix} \Bigg)\\ &= \det \begin{bmatrix} -\lambda&1&0 \\ 0&-\lambda&1 \\ -1&0&-\lambda \end{bmatrix}\\ &= -\lambda^3 - 1\\ \therefore\ \lambda &= -1 \end{align*}$$
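For clarity, the step from the matrix to the cubic is a standard cofactor expansion along the first row (spelled out here; it is implicit in the derivation above):

$$\det\begin{bmatrix} -\lambda&1&0 \\ 0&-\lambda&1 \\ -1&0&-\lambda \end{bmatrix} = -\lambda\det\begin{bmatrix}-\lambda&1\\0&-\lambda\end{bmatrix} - 1\cdot\det\begin{bmatrix}0&1\\-1&-\lambda\end{bmatrix} = -\lambda(\lambda^2) - 1(0+1) = -\lambda^3 - 1.$$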

Using the eigenvalue I found ($-1$), I want to find the eigenvectors: $$\begin{align*} (A - \lambda I)\vec{V} &= 0\\ \Bigg(\begin{bmatrix}0&1&0\\0&0&1\\-1&0&0\end{bmatrix} - \begin{bmatrix}-1&0&0\\0&-1&0\\0&0&-1\end{bmatrix}\Bigg) \begin{bmatrix}x\\y\\z \end{bmatrix} &= \begin{bmatrix}0\\0\\0\end{bmatrix}\\ \begin{bmatrix}1&1&0\\0&1&1\\-1&0&1\end{bmatrix} \begin{bmatrix} x\\y\\z \end{bmatrix} &= \begin{bmatrix}0\\0\\0\end{bmatrix}\\ \begin{bmatrix} x+y \\ y+z \\ -x+z \end{bmatrix} &= \begin{bmatrix}0\\0\\0\end{bmatrix} \end{align*}$$

But what should I do from here? What exactly are the eigenvectors? Does this mean that there are infinitely many eigenvectors, and that any vector satisfying these three equations is an eigenvector?


Three answers follow.

Best answer:

Since your characteristic equation is:

$$ \lambda^3 = -1 \rightarrow \lambda = e^{\left(\pi + \frac{2n\pi}{3}\right)i}, \quad n = 0, 1, 2, $$ which gives three distinct eigenvalues, there are exactly three eigenvectors (up to scaling), only one of which has the real eigenvalue $\lambda = -1$.

$$ \begin{bmatrix} 1 & 1 & 0 \\ 0 &1& 1 \\ -1 &0 & 1 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 1 & 0 \\ 0 &1& 1 \\ 0 &1 & 1 \end{bmatrix} $$

Now the last two rows are identical (as we would expect, since $A - \lambda I$ must be singular), which gives:

$$ y = -z \\ x = -y = z \\ (z, -z, z) \rightarrow (1, -1, 1) $$

So $\left\langle1, -1, 1\right\rangle$ or $\left\langle \frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right\rangle$ is the only eigenvector for $\lambda = -1$.

By only eigenvector, I mean that all eigenvectors for $\lambda = -1$ will be scalar multiples of the above.
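As a quick sanity check (a NumPy sketch, not part of the original answer), one can verify numerically that $(1, -1, 1)$ satisfies $Av = -v$:

```python
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [-1, 0, 0]])
v = np.array([1, -1, 1])

# For an eigenvector with eigenvalue -1, A v must equal -v.
print(A @ v)                    # expected: [-1  1 -1]
print(np.allclose(A @ v, -v))   # expected: True
```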

Another answer:

Well, the polynomial $\lambda^3+1$ has two more (complex) roots, which correspond to a rotation in a $2$-dimensional subspace.

In your last system of equations (which already has $\lambda=-1$ substituted), set, say, $x=1$ to find one eigenvector.
(You are right: there are infinitely many eigenvectors if there is one as they always form a subspace.)

Another answer:

For any scalar $k$, if $v$ is an eigenvector for the eigenvalue $\lambda$ ($Av=\lambda v$) then so is $kv$ ($A(kv)=kAv=k\lambda v=\lambda(kv)$). The last line of your set of matrix equations is the homogeneous system $x+y=0$, $y+z=0$, $-x+z=0$. This system is not linearly independent, but that's OK. Doing a bit of solving gives you $x=z$ and $y=-z$, so $(x,y,z)=(z,-z,z)=z(1,-1,1)$. Yes, you get infinitely many eigenvectors, but they are all scalar multiples of eigenvector $(1,-1,1)$. This is normal -- you actually get a whole subspace of eigenvectors for any eigenvalue, with the dimension of the subspace corresponding to the multiplicity of the root in the characteristic polynomial. The vector $(1,-1,1)$ provides a basis for the eigenspace associated with your eigenvalue $\lambda=1$. (You should also note that, in your example, you also get two complex roots -- if your base field is $\mathbb C$ then you get eigenspaces for those values, too.) [Edit: fixed a typo]