The behavior of the recursion $x_{k+1}=Ax_k$ when the initial vector cannot be written as a linear combination of eigenvectors.


I have a question regarding the sequence of vectors given by
$x_{k+1}=Ax_k, \quad k=0,1,2,...$
with initial vector $x_0$. If the initial condition can be written as a linear combination of eigenvectors of $A$, then the long-term behavior can be understood quite easily since
$x_k=A^k x_0=A^k(c_1v_1+...+c_pv_p)=c_1\lambda_1^k v_1 + ... + c_p \lambda_p^k v_p$
From this equation we can look at the eigenvector whose eigenvalue is largest in absolute value to understand how the system will evolve as $k$ increases. My question concerns the behavior of the system if the initial condition cannot be written as a linear combination of eigenvectors ($A$ is not diagonalizable). In this case, how would we expect the system to evolve? Would it converge to something that can be written as a linear combination of eigenvectors, or does something entirely different happen, if it can even be determined?
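In the diagonalizable case, the dominance of the largest eigenvalue can be seen numerically. The following sketch (matrix and initial vector are made-up examples) iterates $x_{k+1}=Ax_k$ with normalization and checks that the direction of $x_k$ aligns with the dominant eigenvector:

```python
# Sketch: for a diagonalizable A, the normalized iterates of x_{k+1} = A x_k
# align with the eigenvector of the eigenvalue largest in absolute value.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # symmetric, hence diagonalizable; eigenvalues 3 and 1
eigvals, eigvecs = np.linalg.eig(A)

x = np.array([1.0, 0.0])             # x_0, a mix of both eigenvectors
for _ in range(50):
    x = A @ x
    x = x / np.linalg.norm(x)        # normalize to avoid overflow

dominant = eigvecs[:, np.argmax(np.abs(eigvals))]
dominant = dominant / np.linalg.norm(dominant)
print(np.abs(x @ dominant))          # ≈ 1: x_k aligns with the dominant eigenvector
```

Here the contribution of the smaller eigenvalue decays like $(1/3)^k$, so after 50 steps it is negligible.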

**Best Answer**

Here's an approach to the problem that's very similar to that outlined in Will Jagy's comment.

You can always express the initial vector as a linear combination of generalised eigenvectors. A generalised eigenvector is a nonzero vector $\ v\ $ such that $\ (A-\lambda I)^nv=0\ $ for some eigenvalue $\ \lambda\ $ and natural number $\ n\ $. For each eigenvalue $\ \lambda\ $ there exist one or more sequences of linearly independent generalised eigenvectors $\ w_0, w_1,\dots, w_r\ $ such that

  • $\ w_0\ $ is an eigenvector corresponding to the eigenvalue $\ \lambda\ $;
  • $\ (A-\lambda I)w_i=w_{i-1}\ $ for $\ i=1,2,\dots, r\ $; and
  • $\ w_r\not\in\operatorname{colspace}(A-\lambda I)\ $.
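The three conditions above can be checked numerically on a small example. The matrix and chain below are hand-built for illustration: $A$ has the single eigenvalue $\lambda=3$ with a chain $w_0, w_1$ (so $r=1$):

```python
# Sketch: verify the chain conditions for a hand-built non-diagonalizable A
# with the single eigenvalue λ = 3 and Jordan chain w_0, w_1.
import numpy as np

lam = 3.0
A = np.array([[2.0, 1.0],
              [-1.0, 4.0]])
N = A - lam * np.eye(2)               # A - λI

w0 = np.array([1.0, 1.0])             # eigenvector: (A - λI) w_0 = 0
w1 = np.array([0.0, 1.0])             # generalised: (A - λI) w_1 = w_0

print(np.allclose(N @ w0, 0))         # True
print(np.allclose(N @ w1, w0))        # True
# w_1 not in colspace(A - λI): the column space is span{(1, 1)},
# so appending w_1 = (0, 1) raises the rank.
print(np.linalg.matrix_rank(np.column_stack([N, w1])) >
      np.linalg.matrix_rank(N))       # True
```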

It's not difficult to show by double induction on $\ i\ $ and $\ n\ $ that $$ A^nw_i=\sum_{j=0}^i\lambda^{n-j}{n\choose j}w_{i-j}\ . $$ Thus, if $\ x_0=c_1v_1+c_2v_2+\dots+c_pv_p\ $ with $\ c_i\ne0\ $ and $\ v_i\ $ now generalised eigenvectors, then the asymptotic behaviour of $\ A^nx_0\ $ will still be determined by looking at the generalised eigenvectors corresponding to the eigenvalue with the largest magnitude. If there's a sequence $\ w_0, w_1,\dots, w_r\ $ of generalised eigenvectors of the form described above corresponding to that eigenvalue, the dominating term will be $\ \lambda^{n-r}{n\choose r}w_0\ $, since $\ {n\choose j}\ $ grows like $\ n^j\ $ and is therefore largest for $\ j=r\ $.
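The displayed formula can be checked numerically. The sketch below uses a single $3\times 3$ Jordan block with eigenvalue $\lambda=2$, for which the chain vectors $w_0, w_1, w_2$ are just the standard basis vectors:

```python
# Sketch: numerically verify A^n w_i = Σ_{j=0}^{i} λ^(n-j) C(n,j) w_{i-j}
# for a single 3x3 Jordan block, where (A - λI) e_i = e_{i-1} gives w_i = e_i.
import numpy as np
from math import comb

lam, n = 2.0, 7
A = np.array([[lam, 1, 0],
              [0, lam, 1],
              [0, 0, lam]])
w = np.eye(3)                         # columns w_0, w_1, w_2

An = np.linalg.matrix_power(A, n)
for i in range(3):
    lhs = An @ w[:, i]
    rhs = sum(lam**(n - j) * comb(n, j) * w[:, i - j] for j in range(i + 1))
    assert np.allclose(lhs, rhs)
print("formula verified")
```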

If $\ v_1,v_2,\dots,v_q\ $ is a basis of generalised eigenvectors of the form described above, arranged in an appropriate order, then the matrix of the linear transformation $\ x\mapsto Ax\ $ with respect to that basis is a Jordan normal form of $\ A\ $: $$ J=P^{-1}AP\ , $$ where $\ P\ $ is the matrix with columns $\ v_1,v_2,\dots, v_q\ $. You then have $$ A^nx_0=PJ^nP^{-1}x_0\ , $$ and this is probably the most straightforward way of tackling the problem.
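The Jordan-form route can be sketched numerically. To avoid relying on a Jordan decomposition routine, the example below builds $A$ from a hand-chosen pair $(P, J)$, and uses the closed form $J^n = \begin{pmatrix}\lambda^n & n\lambda^{n-1}\\ 0 & \lambda^n\end{pmatrix}$ for a $2\times 2$ Jordan block:

```python
# Sketch: A^n x_0 = P J^n P^{-1} x_0 for a non-diagonalizable A,
# built from a hand-chosen Jordan pair (P, J) with eigenvalue λ = 3.
import numpy as np

lam, n = 3.0, 6
P = np.array([[1.0, 2.0],
              [1.0, 3.0]])
J = np.array([[lam, 1.0],
              [0.0, lam]])
A = P @ J @ np.linalg.inv(P)          # A = [[2, 1], [-1, 4]]

Jn = np.array([[lam**n, n * lam**(n - 1)],
               [0.0,    lam**n]])     # closed-form power of the Jordan block

x0 = np.array([1.0, 0.0])
direct = np.linalg.matrix_power(A, n) @ x0
via_jordan = P @ Jn @ np.linalg.inv(P) @ x0
print(np.allclose(direct, via_jordan))  # True
```

For larger matrices one would compute $J$ and $P$ with a symbolic routine (e.g. a Jordan decomposition in a CAS), since the decomposition is numerically unstable in floating point.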