Eigenvalues of matrix exponential under multiplicity


Suppose $A$ is a possibly complex square $n \times n$ matrix. Then $A$ has, including multiplicity, $n$ eigenvalues $\lambda_1, \dots, \lambda_n$. The matrix exponential $e^{A}$, also an $n \times n$ matrix, has eigenvalues $\mu_1, \dots, \mu_n$. It is clear, following the definition of the matrix exponential and its absolute convergence, that the values $e^{\lambda_i}$ are included among $\mu_1, \dots, \mu_n$.

However, is it also true that the multiplicities are the same? In other words, are $(\mu_i)$ and $(e^{\lambda_i})$ (slightly abusing notation here), equal as multisets? If so, how can we see that? I guess part of the work is to show that for each $j$ there is some $i$ such that $\mu_j = e^{\lambda_i}$.
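As a numerical sanity check of the conjectured multiset equality (a sketch only, assuming NumPy and SciPy are available; the matrix and all names below are made up for illustration), one can test it on a deliberately non-diagonalizable matrix:

```python
import numpy as np
from scipy.linalg import expm

# A non-diagonalizable matrix: a 3x3 Jordan block for lambda = 2,
# so A has the single eigenvalue 2 with algebraic multiplicity 3.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])

mu = np.sort_complex(np.linalg.eigvals(expm(A)))         # eigenvalues of e^A
exp_lam = np.sort_complex(np.exp(np.linalg.eigvals(A)))  # the values e^{lambda_i}

# Sorted lists agree to numerical precision, multiplicities included.
print(np.allclose(mu, exp_lam, atol=1e-6))
```

Here $e^A$ indeed has $e^2$ as an eigenvalue of algebraic multiplicity $3$, matching the multiplicity of $2$ for $A$.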



Best answer:

Quoting my comment:

Note that if you only want the algebraic multiplicities then it "has to be true" that the multiplicities are the same, because 1) they're the same if the matrix is diagonalizable, and 2) eigenvalues are a continuous function of the matrix, and every matrix can be approximated by diagonalizable matrices.

This argument can be made fully rigorous as follows.

Step 1: Show that the algebraic multiplicities match for diagonalizable matrices: the exponential of a diagonal matrix $\text{diag} \{ \lambda_i \}$ is the diagonal matrix $\text{diag} \{ \exp(\lambda_i) \}$, and applying the power series term by term to $A = P \, \text{diag} \{ \lambda_i \} \, P^{-1}$ gives $e^A = P \, \text{diag} \{ \exp(\lambda_i) \} \, P^{-1}$.
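Step 1 can be checked numerically (a sketch assuming NumPy and SciPy; the particular diagonal matrix and random $P$ are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

# Diagonalizable A = P D P^{-1}; then e^A = P e^D P^{-1},
# and e^D is diagonal with entries e^{lambda_i}.
D = np.diag([1.0, -0.5, 3.0])
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))
A = P @ D @ np.linalg.inv(P)

lhs = expm(A)
rhs = P @ np.diag(np.exp(np.diag(D))) @ np.linalg.inv(P)
print(np.allclose(lhs, rhs))
```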

Step 2: Show that the characteristic polynomial $\det(\lambda I - \exp(M))$ of the exponential of a matrix $M$ is a continuous function of $M$ (it is a composition of two continuous maps: $M \mapsto \exp(M)$, followed by taking the characteristic polynomial). This says, concretely, that each coefficient of the characteristic polynomial is a continuous (and even smooth, and even analytic) function of the matrix entries of $M$.

Step 3: Show that a matrix $M$ can be approximated by a sequence of diagonalizable matrices $D_i$ whose eigenvalues converge to those of $M$. This can be done most explicitly by using the Jordan normal form of $M$ and perturbing its diagonal entries slightly so that they are all distinct.

Step 4: Approximate an arbitrary matrix $M$ by diagonalizable matrices $D_i$ as above. By continuity, the characteristic polynomials $\det(\lambda I - \exp(D_i))$ converge to $\det(\lambda I - \exp(M))$, and since, by construction, the eigenvalues of the $D_i$ converge to those of $M$, it follows that $\det(\lambda I - \exp(M)) = \prod_{j=1}^n (\lambda - \exp(\lambda_j))$, where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $M$, as desired.
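The four steps above can be observed numerically (a sketch assuming NumPy and SciPy; the perturbation scheme below is one concrete choice of the $D_i$): perturb the diagonal of a Jordan block so that the eigenvalues become distinct, and watch the characteristic polynomial of the exponential converge.

```python
import numpy as np
from scipy.linalg import expm

# Jordan block J_3(1): defective, single eigenvalue 1.
M = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

def charpoly(B):
    # Coefficients of det(lambda I - B), leading coefficient 1.
    return np.poly(B)

target = charpoly(expm(M))
for eps in [1e-1, 1e-2, 1e-3]:
    # Perturbed diagonal gives distinct eigenvalues 1, 1+eps, 1+2*eps,
    # so D is diagonalizable, and D -> M as eps -> 0.
    D = M + np.diag([0.0, eps, 2 * eps])
    print(eps, np.max(np.abs(charpoly(expm(D)) - target)))
```

The printed error shrinks with `eps`, illustrating the continuity used in Step 2 and the convergence used in Step 4.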

The point of writing the argument in this slightly indirect way is to avoid having to prove that the roots of a polynomial are a continuous function of its coefficients; since the roots don't come in a prescribed order, especially over $\mathbb{C}$, it's a little tricky to make this rigorous (although it can be made rigorous and it is true), and so it's easier to just argue with the characteristic polynomial instead.

This density argument is a powerful technique and makes precise the intuition that statements like this "have to be true" because they're true for diagonalizable matrices. The same or similar density arguments can prove lots of other facts, for example that if $A, B$ are two square matrices then $AB$ and $BA$ have the same characteristic polynomial (note that they aren't necessarily similar).
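The $AB$ versus $BA$ fact mentioned above is easy to test numerically (a sketch assuming NumPy; the random matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# np.poly(X) returns the coefficients of det(lambda I - X);
# AB and BA have the same characteristic polynomial.
print(np.allclose(np.poly(A @ B), np.poly(B @ A)))
```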

This particular density argument shows more generally that if $f$ is a holomorphic function then the eigenvalues of $f(M)$ are $f$ applied to the eigenvalues of $M$, with matching algebraic multiplicities; this is a slight enhancement of the spectral mapping theorem for the holomorphic functional calculus.
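For a holomorphic $f$ other than $\exp$, the same spectral mapping can be checked numerically, e.g. with $f = \cos$ (a sketch assuming SciPy's `cosm` is available; the $2 \times 2$ Jordan block is an arbitrary defective example):

```python
import numpy as np
from scipy.linalg import cosm

# Spectral mapping check for f = cos on a non-diagonalizable matrix:
# a Jordan block with eigenvalue 0.5 (algebraic multiplicity 2).
M = np.array([[0.5, 1.0],
              [0.0, 0.5]])

mu = np.sort_complex(np.linalg.eigvals(cosm(M)))          # eigenvalues of f(M)
f_lam = np.sort_complex(np.cos(np.linalg.eigvals(M)))     # f applied to eigenvalues of M
print(np.allclose(mu, f_lam, atol=1e-6))
```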

Another answer:

Following the hint from Exodd, recall that $A$ is similar to a Jordan matrix $$J = \bigoplus_{k=1}^{p} J_{n_k}(\lambda_k).$$ Above, $J_{n_k}(\lambda_k)$ denotes the $n_k \times n_k$ Jordan block corresponding to eigenvalue $\lambda_k$, and $p$ is the number of Jordan blocks (the $\lambda_k$ need not be distinct).

By induction, one can check that for nonnegative integers $t$, the matrix $A^t$ is similar (under the same similarity transformation) to $$ J^t = \bigoplus_{k=1}^{p} J_{n_k}^t(\lambda_k). $$ Consequently, if $T$ denotes the similarity transformation, $$ \exp(A) = \sum_{t\geq 0} \frac{TJ^tT^{-1}}{t!} = T \exp(J) T^{-1}.$$ Stated differently, $e^{A}$ is similar to $e^J$.

Moreover, by direct calculation one can check that $$ \exp(J) = \bigoplus_{k=1}^p \exp(J_{n_k}(\lambda_k)) = \bigoplus_{k=1}^p e^{\lambda_k} M_{n_k}, $$ where for positive integers $n$ we define $M_{n}$ by $$ M_{n} = \begin{bmatrix} 1 & \tfrac{1}{1!} & \cdots & \tfrac{1}{(n-2)!} & \tfrac{1}{(n-1)!} \\ 0 & 1 & \cdots & \tfrac{1}{(n-3)!} & \tfrac{1}{(n-2)!} \\ 0 & 0 & 1 & \cdots & \tfrac{1}{(n-3)!} \\ \vdots & & \ddots & \ddots & \vdots\\ 0 & \cdots & \cdots & 0 & 1 \\ \end{bmatrix}, $$ that is, $(M_n)_{ij} = \tfrac{1}{(j-i)!}$ for $j \geq i$ and $0$ otherwise. (This follows by writing $J_n(\lambda) = \lambda I + N$ with $N$ nilpotent; since $\lambda I$ and $N$ commute, $\exp(J_n(\lambda)) = e^{\lambda}\exp(N)$.)

Therefore, we see that $e^A$ is similar to a (block) upper triangular matrix with diagonal $$ (\underbrace{e^{\lambda_1}, \dots, e^{\lambda_1}}_{\mbox{$n_1$ times}}, \underbrace{e^{\lambda_2}, \dots, e^{\lambda_2}}_{\mbox{$n_2$ times}}, \ldots, \underbrace{e^{\lambda_p}, \dots, e^{\lambda_p}}_{\mbox{$n_p$ times}}), $$ and so the claim about multiplicity follows.
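The block formula $\exp(J_n(\lambda)) = e^{\lambda} M_n$ can be verified numerically (a sketch assuming NumPy and SciPy; `exp_jordan_block` is a helper name introduced here for illustration):

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

def exp_jordan_block(lam, n):
    # e^{J_n(lam)} = e^lam * M_n, where (M_n)_{ij} = 1/(j-i)! for j >= i.
    M = np.array([[1.0 / factorial(j - i) if j >= i else 0.0
                   for j in range(n)] for i in range(n)])
    return np.exp(lam) * M

lam, n = -0.7, 4
J = lam * np.eye(n) + np.diag(np.ones(n - 1), 1)  # Jordan block J_n(lam)
print(np.allclose(expm(J), exp_jordan_block(lam, n)))
```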