Faster way to compute eigenvalues of a matrix using row operations


I had to find the eigenvalues of this matrix:

$\begin{bmatrix} -20 & -25 & -22 \\ 6 & 11 & 12 \\ 5 & 5 & 1 \end{bmatrix}$

I found the characteristic polynomial:

$P(\lambda) = -\lambda^3 -8\lambda^2 +29\lambda +180 $

Then I set $P(\lambda) = 0 $ and solved it with Ruffini's rule.

Can I reduce the given matrix in order to ease the computations? Is there a better way? I spent a lot of time trying all of the divisors of $180$ to find a root.
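(An aside, not part of the original question: the characteristic polynomial and its roots can be double-checked with SymPy. A quick sketch:)

```python
import sympy as sp

A = sp.Matrix([[-20, -25, -22],
               [6, 11, 12],
               [5, 5, 1]])
X = sp.symbols('X')

# charpoly computes det(X*I - A), the monic characteristic polynomial
p = A.charpoly(X).as_expr()
print(sp.expand(p))    # X**3 + 8*X**2 - 29*X - 180
print(sp.solve(p, X))  # the roots are -9, -4 and 5
```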

I was thinking about adding $5$ times the third row to the first in order to set $a_{12} = 0 $ and then applying Laplace expansion.


First off, what you wrote down is not the characteristic polynomial but its negative, and with $\lambda$ (which traditionally stands for an eigenvalue, i.e., a scalar) substituted for the indeterminate, which I shall take to be $X$. So $X^3+8X^2-29X-180$ is the actual characteristic polynomial.

But gripes apart, there is a heuristic method to find this more easily, which works only in fabricated cases such as this one, where there is a simple basis of eigenvectors with small integer or rational entries. It consists of writing down the matrix whose determinant is the characteristic polynomial, namely $XI-A$, and doing row and column operations on it before actually computing the determinant. One thing I have (again heuristically) found useful is to focus on reducing large coefficients (not necessarily to $0$) and to work by conjugation: every time you add $c$ times row $k$ to another row $l$, subsequently (or simultaneously, if you can keep this straight in your head) subtract $c$ times column $l$ from column $k$. And vice versa. This ensures that afterwards the occurrences of $X$ remain only on the diagonal.
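In matrix terms (my paraphrase, not the answerer's code): adding $c$ times row $k$ to row $l$ is left multiplication by $E = I + c\,e_{lk}$, and the compensating column operation is right multiplication by $E^{-1} = I - c\,e_{lk}$, so each pair of moves is a similarity transform and leaves the eigenvalues untouched. A small NumPy sketch of that invariance:

```python
import numpy as np

A = np.array([[-20., -25., -22.],
              [  6.,  11.,  12.],
              [  5.,   5.,   1.]])

# E = I + c*e_{lk}: left-multiplying adds c times row k to row l;
# its inverse I - c*e_{lk} subtracts c times column l from column k.
c, k, l = 1.0, 1, 2           # e.g. add row 2 to row 3 (0-based indices)
E = np.eye(3)
E[l, k] = c

B = E @ A @ np.linalg.inv(E)  # conjugate of A: same eigenvalues

print(np.sort(np.linalg.eigvals(A).real))  # approximately -9, -4, 5
print(np.sort(np.linalg.eigvals(B).real))  # approximately -9, -4, 5
```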

So in your example, let us first reduce that coefficient $25$ by subtracting column $3$ from column $2$; then for good measure add row $2$ to row $3$: $$ \left|\matrix{X+20 & 25 & 22 \\ -6 & X-11 & -12 \\ -5 & -5 & X-1}\right| =\left|\matrix{X+20 & 3 & 22 \\ -6 & X+1 & -12 \\ -5 & -X-4 & X-1}\right| =\left|\matrix{X+20 & 3 & 22 \\ -6 & X+1 & -12 \\ -11 & -3 & X-13}\right| $$ That is a bit better, but not much. Now we seem to be able to make more progress either by subtracting column $1$ from column $3$ or by adding row $3$ to row $1$; but these are two sides of a single conjugation, so let's do both (in the order given): $$ \left|\matrix{X+20 & 3 & 22 \\ -6 & X+1 & -12 \\ -11 & -3 & X-13}\right| =\left|\matrix{X+20 & 3 & -X+2 \\ -6 & X+1 & -6 \\ -11 & -3 & X-2}\right| =\left|\matrix{X+9 & 0 & 0 \\ -6 & X+1 & -6 \\ -11 & -3 & X-2}\right| $$ Weren't we lucky: two coefficients $0$ created, and we did not even directly try to do so. At this point we could easily compute and factor the characteristic polynomial; the factor $X+9$ is evident.

But with such good fortune so far, let's see if we can continue a bit. Subtracting column $3$ from column $1$ looks promising, especially since then adding row $1$ to row $3$ creates yet another $0$. This time I'll do both at once: $$ \left|\matrix{X+9 & 0 & 0 \\ -6 & X+1 & -6 \\ -11 & -3 & X-13}\right| =\left|\matrix{X+9 & 0 & 0 \\ 0 & X+1 & -6 \\ 0 & -3 & X-2}\right|. $$ Great, although not really a big step towards computing the characteristic polynomial. After a bit of thought, now add column $2$ to column $3$ and subtract row $3$ from row $2$, giving $$ \left|\matrix{X+9 & 0 & 0 \\ 0 & X+1 & -6 \\ 0 & -3 & X-2}\right| =\left|\matrix{X+9 & 0 & 0 \\ 0 & X+4 & 0 \\ 0 & -3 & X-5}\right|. $$ Now the factorisation of the characteristic polynomial is evident.
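The whole chain of paired operations can be replayed symbolically; here is a sketch (my own check, using SymPy with 0-based indices), where a hypothetical helper `conjugate(M, c, k, l)` adds $c$ times row $k$ to row $l$ and subtracts $c$ times column $l$ from column $k$:

```python
import sympy as sp

X = sp.symbols('X')
A = sp.Matrix([[-20, -25, -22], [6, 11, 12], [5, 5, 1]])
M = X * sp.eye(3) - A

def conjugate(M, c, k, l):
    # add c*(row k) to row l, then subtract c*(column l) from column k;
    # together this is M -> E*M*E^(-1) with E = I + c*e_{lk}
    M = M.copy()
    M[l, :] += c * M[k, :]
    M[:, k] -= c * M[:, l]
    return M

# the four pairs of operations used in the worked example, in order
for c, k, l in [(1, 1, 2), (1, 2, 0), (1, 0, 2), (-1, 2, 1)]:
    M = conjugate(M, c, k, l)

print(M)  # Matrix([[X + 9, 0, 0], [0, X + 4, 0], [0, -3, X - 5]])
```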

We can even get to a completely diagonal form, killing the off-diagonal $-3$, by subtracting $c$ times column $3$ from column $2$ then adding $c$ times row $2$ to row $3$, for $c=\frac13$. All in all, we have explicitly conjugated $A$ to a diagonal matrix with diagonal entries $-9,-4,5$. The matrices we used to right-multiply by (and then left-multiply by their inverse) were ones differing from the identity by just one off-diagonal coefficient, namely successively $$ \pmatrix{1&0&0\\0&1&0\\0&-1&1}, \pmatrix{1&0&-1\\0&1&0\\0&0&1}, \pmatrix{1&0&0\\0&1&0\\-1&0&1}, \pmatrix{1&0&0\\0&1&1\\0&0&1}, \quad\text{and}\quad \pmatrix{1&0&0\\0&1&0\\0&-\frac13&1}. $$ Altogether we have found the diagonal matrix as $P^{-1}AP$ where $P$ is the product of those matrices, which is $$ P=\pmatrix{2&\frac13&-1\\0&\frac23&1\\-1&-1&0}. $$ You can check that the columns of $P$ are eigenvectors of $A$. And we didn't even solve any linear system to find them.
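That final claim is easy to verify (my sketch, not part of the original answer): with $D=\operatorname{diag}(-9,-4,5)$, check $AP = PD$ and $P^{-1}AP = D$ in exact rational arithmetic:

```python
import sympy as sp

A = sp.Matrix([[-20, -25, -22], [6, 11, 12], [5, 5, 1]])
P = sp.Matrix([[2, sp.Rational(1, 3), -1],
               [0, sp.Rational(2, 3),  1],
               [-1, -1, 0]])
D = sp.diag(-9, -4, 5)

# each column of P is an eigenvector of A: A*P equals P*D
print(A * P == P * D)        # True
print(P.inv() * A * P == D)  # True
```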