Power of a Nonsymmetric Matrix


I have a matrix $M$ and a vector $x$. I would like to compute the vector $y= M^n x$.

I found the eigenvalues $\lambda_k$ of $M$ and the left and right eigenvectors, properly normalized. If $M$ were symmetric, I would now project $x$ onto each eigenvector, multiply by $\lambda_k^n$, and then recombine. That would give me $y$.

But $M$ is not symmetric. Is it of any use that I have both the left and right eigenvectors? Would you know how to compute $y$?

EDIT (in answer to Florian below): the matrix whose rows are the right eigenvectors of $M$ is too hard to invert, so my question is whether one can bypass the inversion by using the fact that both sets of eigenvectors are known.

There are 2 best solutions below


Is $M$ diagonalizable? If so, you can write $M$ as $M=P \cdot \Lambda \cdot P^{-1}$, where $P$ is a matrix containing the eigenvectors of $M$ and $\Lambda$ is a diagonal matrix containing the eigenvalues $\lambda_k$. In this case, we have that $M^n = P \cdot \Lambda^n \cdot P^{-1}$. Based on this result you can do what you intended: "project" $x$ by computing $P^{-1} \cdot x$, then multiply by $\lambda_k^n$ and then recombine using $P$.
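A minimal numerical sketch of this recipe, using a small made-up matrix and vector (the values of $M$, $x$, and $n$ are illustrative choices, not from the question):

```python
import numpy as np

# A small nonsymmetric but diagonalizable example matrix (illustrative choice)
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])
n = 5

# Eigendecomposition M = P @ diag(lam) @ inv(P)
lam, P = np.linalg.eig(M)

# "Project" x, scale by lam**n, recombine:
coeffs = np.linalg.solve(P, x)   # P^{-1} x, without forming inv(P)
y = P @ (lam**n * coeffs)        # P . Lambda^n . P^{-1} x

# Check against direct repeated multiplication
assert np.allclose(y, np.linalg.matrix_power(M, n) @ x)
```

Note that `np.linalg.solve(P, x)` applies $P^{-1}$ to a single vector, which is cheaper and numerically safer than forming the full inverse.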

This is possible if $M$ is diagonalizable, which is true if and only if the geometric and algebraic multiplicities of each eigenvalue of $M$ agree.


If you have a complete basis of right eigenvectors for $M$, say $\{v_1,v_2,\ldots,v_\ell\}$ such that $Mv_k = \lambda_k v_k$, then finding $M^n x$ would be easy if we knew the linear combination $x = \sum_{k=1}^\ell c_k v_k$:

$$ M^n x = \sum_{k=1}^\ell c_k \lambda_k^n v_k $$

The OP asks in an edit whether also knowing the left eigenvectors will help us to get that linear combination. In the present Question the OP stipulated in the Comments that these were complete bases and that the multiplicity of each eigenvalue is $1$.

The fact is that left and right eigenvectors of distinct eigenvalues are orthogonal. This sort of thing has been shown in previous Questions here, but often the discussion is confounded by additional matrix properties (like normality) that are unnecessary. So let's give the one-line proof.

Suppose $v$ is a right eigenvector with $Mv = \lambda v$ and $u^T$ is a left eigenvector with $u^T M = \mu u^T$, where $\lambda \neq \mu$. Then:

$$ \lambda (u^T v) = u^T M v = \mu (u^T v) $$

where the two outer expressions come from letting $M$ act on the right eigenvector or on the left eigenvector. Subtracting gives $(\lambda - \mu)(u^T v) = 0$, and since the first factor is nonzero, $u^T v$ must be zero.
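This orthogonality can be checked numerically. In the sketch below (the matrix is an illustrative choice), the left eigenvectors of $M$ are obtained as the right eigenvectors of $M^T$, since $u^T M = \mu u^T$ is the same as $M^T u = \mu u$:

```python
import numpy as np

# Nonsymmetric 2x2 example with distinct real eigenvalues (trace 5, det 2)
M = np.array([[4.0, 1.0],
              [2.0, 1.0]])

lam_r, V = np.linalg.eig(M)     # columns of V: right eigenvectors v_k
lam_l, U = np.linalg.eig(M.T)   # columns of U: left eigenvectors u_j

# Put both sets in the same eigenvalue order so column j of U and V
# correspond to the same eigenvalue
V = V[:, np.argsort(lam_r)]
U = U[:, np.argsort(lam_l)]

# U^T V should be diagonal: u_j^T v_k = 0 whenever the eigenvalues differ
G = U.T @ V
assert np.allclose(G - np.diag(np.diag(G)), 0.0, atol=1e-8)
```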

Therefore if the left eigenvectors $\{u_1,u_2,\ldots,u_\ell\}$ are ordered so that $u_j^T M = \lambda_j u_j^T$ to match the order of right eigenvectors, then taking "inner products" gives the coefficients:

$$ u_j^T x = \sum_{k=1}^\ell u_j^T (c_k v_k) = c_j (u_j^T v_j) $$

because all the omitted terms in the summation are zero. Thus:

$$ c_j = \frac{u_j^T x}{u_j^T v_j} $$
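The whole recipe — coefficients from left eigenvectors, then recombination — can be sketched as follows. The matrix, vector, and power are made-up examples; the left eigenvectors are computed as right eigenvectors of $M^T$:

```python
import numpy as np

# Companion matrix of (x-1)(x-2)(x-3): eigenvalues 1, 2, 3 (illustrative choice)
M = np.array([[0.0, 0.0,   6.0],
              [1.0, 0.0, -11.0],
              [0.0, 1.0,   6.0]])
x = np.array([1.0, 0.0, 2.0])
n = 4

lam, V = np.linalg.eig(M)        # right eigenvectors v_k (columns of V)
lam_l, U = np.linalg.eig(M.T)    # left eigenvectors u_j (columns of U)

# Sort both sets by eigenvalue so u_j and v_j share the same lambda_j
order = np.argsort(lam)
lam, V = lam[order], V[:, order]
U = U[:, np.argsort(lam_l)]

# c_j = (u_j^T x) / (u_j^T v_j): no matrix inversion needed
c = (U.T @ x) / np.einsum('ij,ij->j', U, V)

# y = sum_k c_k lambda_k^n v_k
y = V @ (c * lam**n)
assert np.allclose(y, np.linalg.matrix_power(M, n) @ x)
```

The `einsum` call computes the columnwise products $u_j^T v_j$ in one pass; the only per-vector cost is a handful of dot products.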


Added:

Since this post was motivated to an extent by concern that the inverse of the matrix $B$ whose columns are (the known basis of) right eigenvectors might be difficult to compute, it's worth noting what the existence of the basis of left eigenvectors says about forming that inverse.

Let $A$ be the $\ell \times \ell$ matrix whose rows are the left eigenvectors, ordered by eigenvalue to match the right eigenvectors (the columns of $B$). In the notation used above:

$$ A = \begin{pmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_\ell^T \end{pmatrix} \; , \; B = \begin{pmatrix} v_1 & v_2 & \cdots & v_\ell \end{pmatrix} $$

As shown above, $AB$ is a diagonal matrix $D$ with entries $D_{jj} = u_j^T v_j$, which explains the comment of Robert Israel. Since $A$ and $B$ are invertible (each has rank $\ell$), $D$ is invertible as well. Thus:

$$ B^{-1} = D^{-1} A$$

The computation of $B^{-1}$ is thus tractable once the left eigenvalue/eigenvector pairs are known. From this we can conclude that $M$ is diagonalized by the similarity transformation $B^{-1} M B$.
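A short numerical check of this closing identity, on the same kind of made-up example matrix used above:

```python
import numpy as np

# Companion matrix of (x-1)(x-2)(x-3): eigenvalues 1, 2, 3 (illustrative choice)
M = np.array([[0.0, 0.0,   6.0],
              [1.0, 0.0, -11.0],
              [0.0, 1.0,   6.0]])

lam, B = np.linalg.eig(M)        # columns of B: right eigenvectors
lam_l, U = np.linalg.eig(M.T)    # left eigenvectors as columns of U

# Order both bases by eigenvalue so rows of A match columns of B
order = np.argsort(lam)
lam, B = lam[order], B[:, order]
A = U[:, np.argsort(lam_l)].T    # rows of A: left eigenvectors

# AB is diagonal with D_jj = u_j^T v_j
D = A @ B
assert np.allclose(D, np.diag(np.diag(D)))

# B^{-1} = D^{-1} A, with D^{-1} a trivial diagonal inversion
B_inv = np.diag(1.0 / np.diag(D)) @ A
assert np.allclose(B_inv @ B, np.eye(3))

# And M is diagonalized by the similarity transformation B^{-1} M B
assert np.allclose(B_inv @ M @ B, np.diag(lam))
```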