Power Method: Showing convergence to dominant eigenvector


What follows is taken from Numerical Analysis, by R. Burden and D. Faires:

Let $A\in \mathbb{R}^{n\times n}$, with eigenvalues $\lambda_1,\dots,\lambda_n$ such that $|\lambda_1|>|\lambda_2|\ge |\lambda_3|\ge\dots\ge |\lambda_n|$. Let $\mathbf{v}^{(1)},\dots,\mathbf{v}^{(n)}$ be a set of $n$ linearly independent eigenvectors, where $\mathbf{v}^{(i)}$ is associated with $\lambda_i$.

Consider a unit vector $\mathbf{x}^{(0)}$ whose coordinate in the direction of $\mathbf{v}^{(1)}$ is non-zero; that is,

$$\mathbf{x}^{(0)}=\sum_{j=1}^n\beta_j\mathbf{v}^{(j)}$$

with $\beta_1\neq 0$.

Define:

$$\mathbf{y}^{(m)}:=A\mathbf{x}^{(m-1)}$$

$$\mu^{(m)}:=y^{(m)}_{p_{m-1}}=\lambda_1\left [ \frac{\beta_1v^{(1)}_{p_{m-1}}+\sum_{j=2}^n\left ( \frac{\lambda_j}{\lambda_1}\right )^m\beta_jv^{(j)}_{p_{m-1}}}{\beta_1v^{(1)}_{p_{m-1}}+\sum_{j=2}^n\left ( \frac{\lambda_j}{\lambda_1}\right )^{m-1}\beta_jv^{(j)}_{p_{m-1}}}\right ]$$

$$\mathbf{x}^{(m)}:=\frac{\mathbf{y}^{(m)}}{y^{(m)}_{p_m}}=\frac{A^m\mathbf{x}^{(0)}}{\prod_{k=1}^my^{(k)}_{p_k}}$$

where, at each step, $p_m$ denotes the smallest integer for which:

$$\left |y_{p_m}^{(m)}\right |=\left \| \mathbf{y}^{(m)}\right \|_\infty$$

and, for any vector $\mathbf{w}$, $w_k$ is the $k$-th entry of $\mathbf{w}$.
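For concreteness, here is a small sketch of this iteration in Python/NumPy (my own illustration, not from the book; the function name and the $2\times 2$ test matrix are made up):

```python
import numpy as np

def power_method(A, x0, num_iters=50):
    """Power method with infinity-norm scaling, following the definitions
    above: y^(m) = A x^(m-1),  mu^(m) = y^(m)_{p_{m-1}},  x^(m) = y^(m) / y^(m)_{p_m}."""
    x = np.asarray(x0, dtype=float)
    # p_0: smallest index at which |x^(0)| attains its infinity norm
    # (np.argmax returns the first, i.e. smallest, such index).
    p = int(np.argmax(np.abs(x)))
    x = x / x[p]                       # scale so the p-th entry is 1 and ||x||_inf = 1
    mu = None
    for _ in range(num_iters):
        y = A @ x                      # y^(m) = A x^(m-1)
        mu = y[p]                      # mu^(m) = y^(m)_{p_{m-1}}: eigenvalue estimate
        p = int(np.argmax(np.abs(y)))  # p_m: smallest index with |y_{p_m}| = ||y||_inf
        if y[p] == 0.0:                # then x^(m-1) is an eigenvector for eigenvalue 0
            return 0.0, x
        x = y / y[p]                   # x^(m) = y^(m) / y^(m)_{p_m}
    return mu, x

# Made-up 2x2 example: eigenvalues 4 and 2, dominant eigenvector (1, 1).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
mu, x = power_method(A, np.array([1.0, 0.0]))
print(mu, x)   # mu approaches 4.0, x approaches [1., 1.]
```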

My question is: why do the vectors $\mathbf{x}^{(m)}$ converge to an eigenvector associated with $\lambda_1$? Burden offers no proof whatsoever of this fact, and simply states that, from the equation defining $\mu^{(m)}$, "we see that, since $|\lambda_j/\lambda_1|<1$ for each $j=2,3,\dots,n$, $\displaystyle \lim_{m\to\infty}\mu^{(m)}=\lambda_1$, provided that $\mathbf{x}^{(0)}$ is chosen so that $\beta_1 \neq 0$. Moreover, the sequence of vectors $\{\mathbf{x}^{(m)}\}_{m=0}^\infty$ converges to an eigenvector associated with $\lambda_1$ that has $l_{\infty}$ norm equal to one."

I see why $\mu^{(m)}\to \lambda_1$, but I'm lost as to why $\mathbf{x}^{(m)}$ must converge to an eigenvector corresponding to $\lambda_1$.

Thanks in advance!

1 Answer

Although Burden and Faires might not have explicitly shown that the sequence of vectors $\{\mathbf{x}^{(m)}\}$ converges to a unit vector in the direction of $\mathbf{v}^{(1)}$, we can show this is true by using the various facts and definitions that you’ve written down.

To start, we establish that $\mathbf{x}^{\left( m \right)} = \frac{{A^m \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^m \mathbf{x}^{\left( 0 \right)} } \right\|_\infty }}(\prod\limits_{j = 1}^m {c_j })$ where $c_j=-1$ if $y_{p_j}^{(j)} < 0$ and $c_j=1$ otherwise. This will show that $\{\mathbf{x}^{(m)}\}$ is a sequence of unit vectors in the $\infty$-norm. This is clear for $m=1$ (using your definition for $\mathbf{x}^{(m)}$), and we proceed by induction.

From what you’ve written down, we have that $$\mathbf{x}^{(m)}:=\frac{\mathbf{y}^{(m)}}{y^{(m)}_{p_m}}= \frac{c_m{\mathbf{y}^{\left( m \right)} }}{{\left\| {\mathbf{y}^{\left( m \right)} } \right\|_\infty }} = \frac{c_m{A\mathbf{x}^{\left( {m - 1} \right)} }}{{\left\| {A\mathbf{x}^{\left( {m - 1} \right)} } \right\|_\infty}}.$$

By the induction hypothesis, we also have that $\mathbf{x}^{\left( m-1 \right)} = \frac{{A^{m -1}\mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^{m-1} \mathbf{x}^{\left( 0 \right)} } \right\|_\infty}}(\prod\limits_{j = 1}^{m-1} {c_j})$, so we can substitute this into the previous equation to get

$$\mathbf{x}^{\left( m \right)} = \frac{c_m{A\frac{{A^{m - 1} \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^{m - 1} \mathbf{x}^{\left( 0 \right)} } \right\|_\infty }}}(\prod\limits_{j = 1}^{m-1} {c_j})}{{\left\| {A\frac{{A^{m - 1} \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^{m - 1} \mathbf{x}^{\left( 0 \right)} } \right\|_\infty}}(\prod\limits_{j = 1}^{m-1} {c_j})} \right\|_\infty}} = \frac{{\frac{{A^m \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^{m - 1} \mathbf{x}^{\left( 0 \right)} } \right\|_\infty}}}(\prod\limits_{j = 1}^{m} {c_j})}{{\left\| {\frac{{A^m \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^{m - 1} \mathbf{x}^{\left( 0 \right)} } \right\|_\infty }}} \right\|_\infty }} = \frac{{A^m \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^m \mathbf{x}^{\left( 0 \right)} } \right\|_\infty}}(\prod\limits_{j = 1}^{m} {c_j}).$$

Therefore we’ve shown that $\{\mathbf{x}^{(m)}\}$ is a sequence of unit vectors in the $\infty$-norm and have given an expression for its members involving $A$ and $\mathbf{x}^{(0)}$.
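As an aside, one can sanity-check this identity numerically; the following sketch (my own addition, using NumPy with a made-up $3\times 3$ matrix and starting vector) confirms it for the first few iterates:

```python
import numpy as np

# Numerical check of the identity
#   x^(m) = (prod_{j<=m} c_j) * A^m x^(0) / ||A^m x^(0)||_inf,
# where c_j is the sign of y^(j)_{p_j}.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
x0 = np.array([1.0, 0.2, -0.4])          # starting vector with ||x^(0)||_inf = 1

x = x0.copy()
sign_product = 1.0
for m in range(1, 11):
    y = A @ x
    p = int(np.argmax(np.abs(y)))        # smallest index attaining ||y||_inf
    sign_product *= np.sign(y[p])        # accumulate c_m
    x = y / y[p]                         # x^(m) as defined in the question

    A_m_x0 = np.linalg.matrix_power(A, m) @ x0
    rhs = sign_product * A_m_x0 / np.linalg.norm(A_m_x0, np.inf)
    assert np.allclose(x, rhs)           # left- and right-hand sides agree

print("identity verified for m = 1, ..., 10")
```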

Using your expression for $\mathbf{x}^{(0)}$, we have that

$$A^m\mathbf{x}^{(0)}=\lambda _1^m \left\{ {\beta _1 \mathbf{v}^{\left( 1 \right)} + \sum\limits_{j = 2}^n {\left( {\frac{{\lambda _j }}{{\lambda _1 }}} \right)^m \beta _j \mathbf{v}^{\left( j \right)} } } \right\}.$$
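(To spell this step out: each eigenvector satisfies $A^m\mathbf{v}^{(j)}=\lambda_j^m\mathbf{v}^{(j)}$, so by linearity

$$A^m\mathbf{x}^{(0)}=\sum_{j=1}^n\beta_j A^m\mathbf{v}^{(j)}=\sum_{j=1}^n\beta_j\lambda_j^m\mathbf{v}^{(j)},$$

and factoring $\lambda_1^m$ out of the sum gives the bracketed expression above.)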

Substituting this expression into $\mathbf{x}^{\left( m \right)} = \frac{{A^m \mathbf{x}^{\left( 0 \right)} }}{{\left\| {A^m \mathbf{x}^{\left( 0 \right)} } \right\|_\infty }}(\prod\limits_{j = 1}^m {c_j })$ and letting $m$ grow large, we see that the summation term tends to zero and so the sequence $\{\mathbf{x}^{(m)}\}$ does indeed converge to a unit eigenvector corresponding to the eigenvalue $\lambda _1$ (i.e., it tends to a unit multiple of $\frac{{\mathbf{v}^{\left( 1 \right)} }}{{\left\| {\mathbf{v}^{\left( 1 \right)} } \right\|_\infty}}$).
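To make the last limit explicit, write $\boldsymbol{\varepsilon}^{(m)}:=\sum_{j=2}^n\left(\frac{\lambda_j}{\lambda_1}\right)^m\beta_j\mathbf{v}^{(j)}$, which tends to $\mathbf{0}$ because $|\lambda_j/\lambda_1|<1$. Then

$$\frac{A^m\mathbf{x}^{(0)}}{\left\|A^m\mathbf{x}^{(0)}\right\|_\infty}=\frac{\lambda_1^m\left(\beta_1\mathbf{v}^{(1)}+\boldsymbol{\varepsilon}^{(m)}\right)}{\left|\lambda_1\right|^m\left\|\beta_1\mathbf{v}^{(1)}+\boldsymbol{\varepsilon}^{(m)}\right\|_\infty}=\left(\operatorname{sgn}\lambda_1\right)^m\frac{\beta_1\mathbf{v}^{(1)}+\boldsymbol{\varepsilon}^{(m)}}{\left\|\beta_1\mathbf{v}^{(1)}+\boldsymbol{\varepsilon}^{(m)}\right\|_\infty},$$

and the fraction on the right converges to $\frac{\beta_1\mathbf{v}^{(1)}}{\left\|\beta_1\mathbf{v}^{(1)}\right\|_\infty}=\pm\frac{\mathbf{v}^{(1)}}{\left\|\mathbf{v}^{(1)}\right\|_\infty}$, an eigenvector for $\lambda_1$ with $\infty$-norm one. The remaining factor $\left(\operatorname{sgn}\lambda_1\right)^m\prod_{j=1}^m c_j\in\{\pm 1\}$ is exactly the "unit multiple" referred to above.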