Power iteration


If $A$ is a matrix, you can approximate its largest eigenvalue $\lambda_1$ by power iteration. What are the exact conditions under which the power iteration converges?

In particular, I often see the demand that the matrix be symmetric. Is this necessary? What seems to be indispensable is that there is an eigenvalue of strictly largest absolute value. But what about the structure of the eigenspace of this eigenvalue?

Apparently, it is often overlooked that the eigenspace of this largest eigenvalue need not be one-dimensional. So what happens if the eigenspace is two-dimensional? Can we still use this algorithm?



BEST ANSWER

A sufficient condition for convergence is that (1) there is exactly one eigenvalue with the maximal absolute value of all eigenvalues, and (2) the starting vector has non-zero component in the associated eigenspace.

If $\lambda_1, \dots, \lambda_m$ are the distinct eigenvalues, condition (1) means $$ |\lambda_1| > |\lambda_j| $$ for all $j=2,\dots, m$. It is not necessary to assume that the eigenspace is one-dimensional. The Wikipedia page already has the convergence proof.
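To illustrate, here is a minimal sketch of power iteration in Python (my own illustration, not code from the answer). The matrix `A` is a hypothetical example whose dominant eigenvalue $2$ has a two-dimensional eigenspace; the iteration still converges, because condition (1) concerns the eigenvalue, not the dimension of its eigenspace.

```python
import numpy as np

def power_iteration(A, x0, iters=200):
    """Repeatedly apply A and normalize; under conditions (1) and (2),
    the direction of x converges into the dominant eigenspace."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)
    # Rayleigh quotient approximates the dominant eigenvalue
    return x @ A @ x, x

# Eigenvalue 2 has a TWO-dimensional eigenspace, yet |2| > |0.5|,
# so condition (1) holds and the iteration converges.
A = np.diag([2.0, 2.0, 0.5])
lam, v = power_iteration(A, np.array([1.0, 1.0, 1.0]))
# lam is approximately 2; v lies in the plane spanned by e1 and e2
```

The limit direction depends on the starting vector: any starting vector with non-zero component in the plane $\operatorname{span}(e_1, e_2)$ converges to its own projection onto that plane, normalized.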

Surb's second matrix $A$ is not a counterexample to this sufficient condition: $A$ has the eigenvalues $+1$ and $-1$, which have the same absolute value, so condition (1) is violated.

The assumptions are also not fulfilled for a real matrix without real eigenvalues: if the starting vector is purely real, then all iterates are purely real and cannot approximate a non-real eigenvector.
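A quick sketch of this last case (my own example, not from the answer): a $90°$ rotation matrix has eigenvalues $\pm i$ and no real eigenvalues, and its real iterates cycle with period 4 instead of settling on a direction.

```python
import numpy as np

# 90-degree rotation: eigenvalues are +i and -i, so |lambda| = 1 for both
# and there is no real eigenvector for the iterates to converge to.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

x = np.array([1.0, 0.0])
orbit = []
for _ in range(4):
    x = R @ x
    orbit.append(x.copy())
# The real iterates cycle: [0,1] -> [-1,0] -> [0,-1] -> [1,0] -> ...
```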

Answer (by Surb):

Let $$M:=\begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix},\ \text{then}\quad M \begin{pmatrix}1\\ 1 \end{pmatrix}=1\cdot \begin{pmatrix}1\\ 1 \end{pmatrix},\ \text{and} \quad M \begin{pmatrix}1\\ -1 \end{pmatrix}=-1\cdot \begin{pmatrix}1\\ -1 \end{pmatrix}. $$ Now let $x^0 =(x^0_1,x^0_2) \neq (0,0)$ with $x^0_1 \neq x^0_2$; then the sequence $x^{k+1} = Mx^k$ never converges, since $M$ just swaps the coordinates of $x^k$.

If you want a matrix whose maximal eigenvalue (without taking absolute values) is not simple and for which the algorithm still doesn't converge, consider $$A = \begin{pmatrix}0 & 1 & 0 \\ 1 &0 & 0 \\ 0& 0 &1\end{pmatrix}, \quad A\begin{pmatrix}1 \\ 1 \\ 0 \end{pmatrix}=1\cdot \begin{pmatrix}1 \\ 1 \\ 0 \end{pmatrix},\quad A\begin{pmatrix}1 \\ -1 \\ 0 \end{pmatrix}=-1\cdot \begin{pmatrix}1 \\ -1 \\ 0 \end{pmatrix},\quad A\begin{pmatrix}0 \\ 0 \\ 1 \end{pmatrix}=1\cdot \begin{pmatrix}0 \\ 0\\ 1 \end{pmatrix}. $$ The sequence $x^{k+1} = Ax^k$ will never converge unless you start directly with an eigenvector.
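The oscillation for $M$ is easy to check numerically (a small demo of my own, not part of the answer):

```python
import numpy as np

M = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # eigenvalues +1 and -1, both with |lambda| = 1

x  = np.array([2.0, 5.0])    # generic start, not an eigenvector
x1 = M @ x                   # coordinates swapped: [5, 2]
x2 = M @ x1                  # swapped back: [2, 5] -- period-2 oscillation
```

Because $|{+1}| = |{-1}|$, neither eigenvalue dominates, so the iterates bounce between the two directions forever instead of converging.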