Matrix diagonalization takes infinitely many operations?


I was reading chapter 7 (Tridiagonal Form) of the book The Symmetric Eigenvalue Problem by Parlett, and I came across a curious passage.

When explaining the advantages/usefulness of the tridiagonal form of symmetric matrices, he explains (with my comments in square brackets):

  1. The eigenvalues and eigenvectors of $T$ [a tridiagonal matrix] can be found with significantly fewer arithmetic operations than are required for a full $A$ [full as in every entry could be nonzero].

  2. Every $A$ [arbitrary matrix] can be reduced to a similar $T$ [tridiagonal] by a finite number of orthogonal similarity transformations. **In principle, it requires an infinite number of transformations to diagonalize a matrix.**

The problematic passage is in bold.

Not being well versed in matrix diagonalization algorithms beyond those taught in early undergraduate linear algebra courses (i.e., finding the roots of the characteristic polynomial and computing eigenvalues/eigenvectors by hand), I am puzzled by this statement.

Of course, I expect that diagonalizing an $n\times n$ matrix for large $n$ could take a very large number of operations, but the claim that it can take infinitely many seems very counterintuitive to me.
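To make the "finite number of orthogonal similarity transformations" part of the claim concrete, here is a small sketch of my own (using numpy; not taken from the book) of Householder reduction to tridiagonal form. It finishes in exactly $n-2$ similarity transformations:

```python
import numpy as np

def tridiagonalize(A):
    """Reduce a symmetric matrix to tridiagonal form using exactly
    n - 2 Householder similarity transformations (a finite process)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 2):                       # finitely many steps
        x = A[k + 1:, k].copy()
        v = x.copy()
        # Householder vector mapping x to a multiple of e_1
        v[0] += np.sign(x[0] if x[0] != 0 else 1.0) * np.linalg.norm(x)
        if np.linalg.norm(v) == 0:
            continue                             # column already reduced
        v /= np.linalg.norm(v)
        H = np.eye(n)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)
        A = H @ A @ H                            # orthogonal similarity
    return A

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
S = M + M.T                                      # symmetric test matrix
T = tridiagonalize(S)
# everything below the first subdiagonal is (numerically) zero,
# and the eigenvalues are unchanged
print(np.allclose(np.tril(T, -2), 0, atol=1e-10))
```

Each pass zeroes one column below the subdiagonal, so after a fixed, finite number of operations the reduction is done exactly (up to roundoff).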

Best answer:

The eigenvalues of a matrix are the roots of its characteristic polynomial, and by the Abel–Ruffini theorem there is no "quintic formula": no closed-form expression in radicals exists for the roots of a general polynomial of degree five or higher. So we must use iterative algorithms to compute the eigenvalues of matrices. While these iterative algorithms may achieve high accuracy in a small number of iterations, infinitely many iterations would be required to obtain the exact eigenvalues.
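A minimal illustration of this (my own sketch, not a production algorithm) is the unshifted QR iteration on a small symmetric matrix: each step is an orthogonal similarity transformation, and the iterates approach a diagonal matrix without ever reaching it exactly.

```python
import numpy as np

# Unshifted QR iteration: A_{k+1} = R_k Q_k where A_k = Q_k R_k.
# Each iterate is similar to A, and the off-diagonal entry shrinks
# geometrically -- but it never becomes exactly zero.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
for _ in range(10):
    Q, R = np.linalg.qr(A)
    A = R @ Q                    # orthogonal similarity transform

off = abs(A[0, 1])
print(off)                       # small, but still nonzero
```

After ten steps the diagonal already approximates the eigenvalues $(5\pm\sqrt{5})/2$ well, yet the off-diagonal entry is only small, not zero: exactness would require infinitely many iterations.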

This is in contrast to direct methods for solving linear systems, which (neglecting roundoff error) compute the exact solution in a finite number of steps.
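By contrast, a direct method really does terminate. A sketch of my own, solving a $2\times 2$ system by Gaussian elimination over exact rationals, so no roundoff obscures the point:

```python
from fractions import Fraction

def solve2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] by elimination:
    a fixed, finite number of arithmetic operations, exact answer."""
    m = c / a                    # eliminate x from the second row
    d2, f2 = d - m * b, f - m * e
    y = f2 / d2                  # back-substitution
    x = (e - b * y) / a
    return x, y

x, y = solve2(*(Fraction(v) for v in (2, 1, 1, 3, 5, 10)))
print(x, y)                      # -> 1 3  (exact)
```

No iteration or convergence is involved: the exact solution drops out after finitely many additions, multiplications, and divisions, which is exactly what cannot happen for the eigenvalues of a general matrix of size $5$ or larger.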