QR algorithm for finding eigenvalues and eigenvectors of a matrix


Let $A$ be symmetric and diagonalizable, and let $\{\lambda_1, \cdots, \lambda_n\} $ be its spectrum. A consequence of the Spectral Theorem assures that $\exists Q$ orthogonal s.t.

$\begin{pmatrix} \lambda_1 & 0 &\cdots & 0\\ 0 & \lambda_2 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & \lambda_n \end{pmatrix} = D = Q^{-1} A Q = Q^T A Q.$
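As a quick numerical illustration of this statement (a sketch using NumPy's `eigh`, which is designed for symmetric matrices; the small matrix here is my own example, not from the question):

```python
import numpy as np

# Example symmetric matrix (eigenvalues 1 and 3).
A = np.array([[2., 1.],
              [1., 2.]])

# eigh returns eigenvalues in ascending order and orthonormal eigenvectors.
lam, Q = np.linalg.eigh(A)

D = Q.T @ A @ Q                          # should equal diag(lam)
print(np.allclose(D, np.diag(lam)))      # True
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q is orthogonal
```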

I want to apply the QR algorithm to find the spectrum of $A$ and an orthonormal basis of eigenvectors of $A$, i.e. the orthogonal matrix $Q$.
$A^{(0)}=A$
FOR $k = 1,2,\dots$
1) compute the factorization $A^{(k-1)}= Q^{(k)}R^{(k)}$
2) set $A^{(k)} = R^{(k)}Q^{(k)}$
3) accumulate $\overline{Q}^{(k)} = Q^{(1)} \cdots Q^{(k)}$
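The loop above can be sketched in a few lines of NumPy (an unshifted, fixed-iteration-count version for illustration only; practical implementations add shifts and deflation):

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Unshifted QR iteration. Returns (A_k, Qbar) with A = Qbar @ A_k @ Qbar.T."""
    Ak = np.array(A, dtype=float)
    Qbar = np.eye(Ak.shape[0])
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)  # 1) factor A^(k-1) = Q R
        Ak = R @ Q               # 2) A^(k) = R Q, a similarity transform of A^(k-1)
        Qbar = Qbar @ Q          # 3) accumulate Qbar = Q^(1) ... Q^(k)
    return Ak, Qbar

# For a symmetric matrix, A_k ends up (numerically) diagonal and Qbar orthogonal.
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])
Ak, Qbar = qr_algorithm(A)
print(np.allclose(Qbar @ Ak @ Qbar.T, A))  # True: similarity is preserved
```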

I've read several notes and books and now I am quite confused. In some it is assumed that all eigenvalues are distinct, in others only symmetry is assumed.
It is easy to see that all $A^{(k)}$ are similar (each step is a similarity transformation, since $A^{(k)} = (Q^{(k)})^T A^{(k-1)} Q^{(k)}$), and therefore they all have the same eigenvalues. But some notes say that $A^{(k)}$ converges to a diagonal matrix as $k \rightarrow \infty$, while others say it converges to a triangular matrix, and I couldn't find a formal proof in either case.
Last but not least, all the notes say that the matrix $\overline{Q}^{(k)}$ converges to an orthogonal matrix whose columns form a basis of eigenvectors of $A$, but does this still hold without assuming that all eigenvalues are distinct? And where can I find a formal proof of all this?
Can anyone help me make this clear and point me to some good references?

  • eigenvalues are distinct: the more relevant property is that all eigenvalues are semi-simple, i.e. that the matrix is diagonalizable. This is guaranteed for symmetric, or more generally normal, matrices. Distinctness only matters for the convergence results (it governs the rate) and has no influence in the considered case of symmetric matrices.

  • $A^{(k)}$ converges to a triangular matrix: this is the result for general matrices. For symmetric matrices, $A^{(k)}$ stays symmetric for all $k$ (it is obtained from $A^{(k-1)}$ by an orthogonal similarity), and a symmetric triangular matrix is diagonal, so "triangular" translates to "diagonal".

  • $\bar Q^{(k)}$ converges to a basis of eigenvectors of $A$: this is only true when the limit of $A^{(k)}$ is diagonal, as in the symmetric case. For real normal matrices, complex-conjugate eigenvalue pairs produce $2\times 2$ diagonal blocks in the limit, and the corresponding columns of the cumulative $\bar Q$ are real and imaginary parts of the pair of conjugate eigenvectors. In the general case, where $A^{(k)}$ becomes increasingly triangular, the leading columns of $\bar Q$ form bases for an increasing sequence of invariant subspaces.
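As a small numerical sketch of the symmetric case raised in the question (my own example matrix, with a repeated eigenvalue, spectrum $\{5, 2, 2\}$), one can check that the columns of the accumulated $\bar Q$ still behave as an orthonormal basis of eigenvectors:

```python
import numpy as np

# Symmetric matrix with a repeated eigenvalue: spectrum {5, 2, 2}.
A = np.array([[3., 1., 1.],
              [1., 3., 1.],
              [1., 1., 3.]])

# Unshifted QR iteration, accumulating Qbar.
Ak, Qbar = A.copy(), np.eye(3)
for _ in range(100):
    Q, R = np.linalg.qr(Ak)
    Ak, Qbar = R @ Q, Qbar @ Q

d = np.diag(Ak)  # approximate eigenvalues on the diagonal
# Residual ||A q_i - d_i q_i|| for each column q_i of Qbar.
residuals = [np.linalg.norm(A @ Qbar[:, i] - d[i] * Qbar[:, i]) for i in range(3)]
print(max(residuals) < 1e-8)  # True: each column is (numerically) an eigenvector
```

Within the two-dimensional eigenspace of the repeated eigenvalue, the particular orthonormal basis $\bar Q$ selects is not unique, but each column is still an eigenvector.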

Stoer/Bulirsch wrote a book on numerical analysis; Watkins wrote a series of papers, which can (or could) be found online, on the Francis QR method and its modern variants. In the netlib.org archives for the LAPACK package one can find research and white papers on the methods used.