Convergence of the powers of stochastic matrices


I want to prove that for an $n \times n$ left stochastic matrix $P$ with dominant eigenvector $v$ and any nonzero vector $x$ with non-negative entries, $P^kx \to \alpha v$ as $k\to\infty$ for some real $\alpha > 0$.

I know how to do the proof when $P$ is diagonalizable, because then I can write $x = \sum_i \alpha_i v_i$ as a linear combination of the eigenvectors. Then $$ P^kx = \sum_i \alpha_i\lambda_i^k v_i. $$ But all $|\lambda_i| <1$ except for the dominant eigenvalue of $P$, which is $1$. So all summands $\to 0$ except for the term with the dominant eigenvalue, and hence $P^k x \to \alpha v$ for some $\alpha$.
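To see the claimed convergence concretely, here is a minimal numerical sketch (the matrix $P$ and the starting vector $x$ below are hypothetical examples, chosen so that $P$ is column-stochastic with all entries positive):

```python
import numpy as np

# A hypothetical left (column-) stochastic matrix: every column sums to 1.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# Dominant eigenvector v for eigenvalue 1, normalized so its entries sum to 1.
w, V = np.linalg.eig(P)
v = np.real(V[:, np.argmax(np.real(w))])
v = v / v.sum()

x = np.array([1.0, 2.0, 0.5])   # nonzero vector with non-negative entries

# Repeatedly apply P; P^k x should approach alpha * v.
y = x.copy()
for _ in range(100):
    y = P @ y

alpha = y.sum() / v.sum()
print(np.allclose(y, alpha * v))
```

After enough iterations the subdominant terms have decayed and `y` is numerically indistinguishable from a positive multiple of `v`.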

$\alpha > 0$ because $0 < \frac{eP^k x}{e v} \to \alpha $, where $e$ is the $1 \times n$ vector whose entries are all $1$'s. (Indeed, since $P$ is left stochastic we have $eP = e$, so $eP^kx = ex$ is a positive constant, giving $\alpha = \frac{ex}{ev} > 0$.)
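The invariance $eP^kx = ex$ that drives this positivity argument is easy to check numerically; a minimal sketch with a hypothetical $2\times 2$ column-stochastic matrix:

```python
import numpy as np

# Hypothetical column-stochastic matrix: columns sum to 1, so e P = e.
P = np.array([[0.9, 0.4],
              [0.1, 0.6]])
e = np.ones(2)          # row vector of all 1's
x = np.array([0.3, 0.7])

# e is a left eigenvector of P for eigenvalue 1.
print(np.allclose(e @ P, e))

# Consequently e P^k x = e x is preserved at every step.
y = x.copy()
for _ in range(50):
    y = P @ y
    assert np.isclose(e @ y, e @ x)
```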

But how can I express $x$ if $P$ is not diagonalizable? Can someone give a hint? Thanks in advance!

EDIT: Following the hint by @StratosFair, I came up with the following proof. By the Jordan form theorem, we can find an invertible matrix $U$ such that $P = U^{-1}AU$, where $A$ is the Jordan form of $P$, which is block diagonal. Then $$ P^kx = U^{-1}A^k U x. $$ But all $|\lambda_i| <1$ except for the dominant eigenvalue of $P$, which is $1$. So all diagonal blocks of $A^k$ tend to $0$ except for the one containing the dominant eigenvalue. So $P^k x \to U^{-1}BUx = \alpha v$ for some $\alpha$, where $B$ denotes the matrix with $1$ in the top left entry and $0$ elsewhere. But I don't really see why $U^{-1}BUx = \alpha v$, i.e. where the dominant eigenvector comes into play. It has to be true in order for the proof to work. Can someone point out why this is the case? Thanks in advance!
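A sketch of the missing step, writing $u_1, \dots, u_n$ for the columns of $U^{-1}$ (with the eigenvalue-$1$ block placed first, so it corresponds to $u_1$): since $BUx$ has $(Ux)_1$ in its first entry and $0$ elsewhere, $$ U^{-1}BUx \;=\; U^{-1}\begin{pmatrix} (Ux)_1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} \;=\; (Ux)_1\, u_1. $$ Moreover, comparing first columns in $PU^{-1} = U^{-1}A$ gives $Pu_1 = u_1$, so $u_1$ is itself a dominant eigenvector and hence a scalar multiple of $v$. Therefore $U^{-1}BUx = \alpha v$ with $\alpha$ proportional to $(Ux)_1$.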

The statement that @InsultedbyMathematics wrote is a theorem in Peter Lax's Linear Algebra book: Theorem 3 on page 200.