Solving for eigenvectors of a $3\times3$ matrix


I got the eigenvalues, which are $4$, $-2$, and $-2$. The matrix is $$\begin{pmatrix}1 &-3& 3 \\3& -5 &3\\ 6 &-6 &4\end{pmatrix}$$

Now, usually when I solve these for a smaller matrix, the second row of the row-reduced matrix always ends up being zero. Does the same thing happen for a $3\times3$ matrix — can I end up with only the top row nonzero? I'm experimenting with the matrix and I think that's possible.

Edit: after experimenting with it some more, I don't think that is possible.

2 Answers

Accepted answer:

The eigenvalues you got are correct. I'm not sure what you mean by one row always equaling zero. It is true that when you compute $A-\lambda I$ and row-reduce it, you will get at least one zero row precisely when $\lambda$ is an eigenvalue. You may get more than one zero row (in the case you have given, you will) when the dimension of the eigenspace is greater than one.
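For the matrix in the question, this is easy to check with SymPy (a sketch of my own, not part of the original answer): row-reducing $A-\lambda I$ leaves one zero row for $\lambda=4$ and two zero rows for $\lambda=-2$.

```python
from sympy import Matrix, eye

A = Matrix([[1, -3, 3],
            [3, -5, 3],
            [6, -6, 4]])

# Row-reduce A - lambda*I for each eigenvalue and count the zero rows.
# In a 3x3 rref, the number of zero rows is 3 minus the number of pivots.
for lam in (4, -2):
    R, pivots = (A - lam * eye(3)).rref()
    print(lam, 3 - len(pivots))  # prints: 4 1, then -2 2
```

The two zero rows for $\lambda=-2$ reflect the two-dimensional eigenspace the answer mentions.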

Another answer:

Eigenvalues solve the characteristic equation $\det(A-\lambda I) = 0$, which means that the eigenvalues $\lambda$ are precisely the values for which the matrix $A-\lambda I$ is singular, since a matrix is singular exactly when its determinant is zero.
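As a sanity check on the eigenvalues from the question (my own sketch, using SymPy), you can expand this determinant symbolically and factor it; the roots are $4$ and $-2$, with $-2$ a double root.

```python
from sympy import Matrix, eye, symbols, factor

lam = symbols('lam')
A = Matrix([[1, -3, 3],
            [3, -5, 3],
            [6, -6, 4]])

# Characteristic polynomial det(A - lam*I); its roots are the eigenvalues.
p = (A - lam * eye(3)).det()
print(factor(p))  # factors with roots lam = 4 and lam = -2 (multiplicity 2)
```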

Whenever a matrix is singular, its null space has dimension greater than zero, which is another way of saying that row reduction produces at least one all-zero row.

Similarly, eigenvectors solve $Ax = \lambda x$, or $(A-\lambda I)x = 0$. This is another way of saying that the eigenvectors for $\lambda$ are precisely the nonzero vectors in the null space of $A-\lambda I$. If the null space of $A-\lambda I$ has dimension 1, then when solving $(A-\lambda I)x=0$, you will get a single zero row. If the null space has dimension 2, you'll get two such rows, and so on.

In fact, the eigenvectors for $\lambda$, together with the zero vector, make up exactly the null space of $A-\lambda I$. So if you know what the eigenvalues are, then finding the eigenvectors is nothing more than computing a basis of the null space of $A-\lambda I$ for each eigenvalue $\lambda$.
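Concretely, SymPy's `nullspace` method performs exactly this computation (again a sketch I'm adding, not part of the original answer): for each eigenvalue it returns a basis of the null space of $A-\lambda I$, and each basis vector can be verified to satisfy $Av = \lambda v$.

```python
from sympy import Matrix, eye

A = Matrix([[1, -3, 3],
            [3, -5, 3],
            [6, -6, 4]])

# For each eigenvalue, a basis of the null space of A - lambda*I
# is a set of eigenvectors spanning that eigenspace.
for lam in (4, -2):
    basis = (A - lam * eye(3)).nullspace()
    for v in basis:
        assert A * v == lam * v  # verify A v = lambda v
    print(lam, [v.T.tolist()[0] for v in basis])
```

For this matrix, $\lambda=4$ yields a one-dimensional eigenspace and $\lambda=-2$ a two-dimensional one, matching the one and two zero rows seen during row reduction.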