How to compute a time-ordered exponential?

So say you have a matrix dependent on a variable t:

$A(t)$

How do you compute $e^{A(t)}$ ?

It seems Sylvester's formula, my standard method of computing matrix exponentials, can't be applied here, given the varying nature of the matrix and the fact that it may not always have distinct eigenvalues.

There are 3 best solutions below


One example of where such ordered exponentials arise is in solving ODEs with time-dependent coefficients: $$ \frac{d\mathbf{x}(t)}{dt} = A(t) \mathbf{x}(t), $$ where $\mathbf{x}(t)=(x_1(t), \ldots, x_n(t))$ and $A(t)$ is an $n\times n$ matrix. The general solution is $$ \mathbf{x}(t) = \mathcal{T}\{ e^{\int_0^{t} A(t')\, dt'}\}\, \mathbf{x}(0), $$ where $\mathcal{T}$ denotes time-ordering,

$\mathcal{T}\{ e^{\int_0^{t} A(t')\, dt'}\} \equiv \sum\limits_{n=0}^{\infty} \frac{1}{n!}\int_0^t \cdots \int_0^t \mathcal{T}\{A(t_1') \cdots A(t_{n}') \}\, dt_1' \cdots dt_n' = \sum\limits_{n=0}^{\infty} \int_0^t dt_1' \int_0^{t_1'} dt_2' \cdots \int_0^{t_{n-1}'} dt_n'\, A(t_1') \cdots A(t_{n}')$.

Only when the matrices at different times commute - i.e. $[A(t_1), A(t_2)]=0$, $\forall t_1,t_2$ - does the time-ordered expression simplify to $e^{\int_0^{t} A(t') dt'}$.
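To make the distinction concrete, here is a minimal numerical sketch. It approximates the time-ordered exponential by a product of short-time propagators $e^{A(t_k + \Delta t/2)\,\Delta t}$ (a standard midpoint splitting) and compares it with the naive, unordered $e^{\int_0^t A(t')\,dt'}$ for a simple non-commuting example; the matrix $A(t)$, the function names, and the step counts are my own choices for illustration, not from the original answer.

```python
import numpy as np
from scipy.linalg import expm

# A simple time-dependent matrix chosen so that A(t1) and A(t2) do NOT commute:
# A(t1) A(t2) = diag(t1, t2) != diag(t2, t1) = A(t2) A(t1) for t1 != t2.
def A(t):
    return np.array([[0.0, t], [1.0, 0.0]])

def time_ordered_exp(A, t, n_steps):
    """Approximate T{exp(int_0^t A(t') dt')} by a product of midpoint
    short-time propagators, with later times acting on the left."""
    dt = t / n_steps
    P = np.eye(2)
    for k in range(n_steps):
        P = expm(A((k + 0.5) * dt) * dt) @ P  # prepend the latest factor
    return P

t = 1.0
P = time_ordered_exp(A, t, 1000)

# Naive (unordered) exponential of the integral:
# int_0^1 A(t') dt' = [[0, 1/2], [1, 0]].
naive = expm(np.array([[0.0, 0.5], [1.0, 0.0]]))

print(np.linalg.norm(P - naive))  # nonzero: the ordering matters here
```

Refining the step size changes `P` only slightly, while the gap to `naive` persists, which is exactly the non-commutativity effect captured by the higher terms of the Magnus expansion.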

Evaluating such exponentials is an arduous task: even for time-independent matrices, computing the exponential is hard enough to warrant articles such as the one mentioned in René's answer. A quick search of the literature reveals a few approaches. Perturbatively, the solution is given by the Magnus expansion, and there are various papers describing methods to approximate or exactly sum it.

The methods typically involve heavy mathematics, and I will not attempt to describe them here given my limited understanding.


You have probably seen this before, but in any case, the following reference explores some of the ways to calculate the matrix exponential: "Nineteen Dubious Ways to Compute the Exponential of a Matrix" by Moler and Van Loan.

Hope it helps!


I note with astonishment that, once again, the current answers have only a vague relation to the asked question.

What follows is an example that answers the question "how to calculate $\exp(A(t))$?" An exact calculation is possible only if one knows explicitly the eigenvalues and eigenvectors as functions of $t$, which is generally hopeless; it's even more infeasible if $A(t)$ has multiple eigenvalues for some values of $t$ (because of the instability of the calculation of Jordan forms). A numerical approximation can be obtained as follows; it's not the most efficient method, but it is the simplest.

Let $A(t)=\begin{pmatrix}\cos(t)&1+t^2-t&\sin(t)\\t^2-2t&\tan(t)&-t^3+1\\2t^2-1&\log(1+t)&t^4+1\end{pmatrix}$. We seek $\exp(A(t))$ for $t\in[0,1]$.

Step 1. For every $i=0,\cdots,10$, we calculate $\exp(A(i/10))$.

Let $U=A(i/10)$; we calculate $\exp(U)$ as follows. Let $V=U/1024$;

then $\exp(V)\approx R=I+V+\cdots+\frac{1}{10!}V^{10}$, and $\exp(U)\approx R^{1024}$ is obtained by squaring $R$ ten times (in Maple syntax):

    for i from 1 to 10 do
      R := R^2:
    od:
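The procedure just described is the classic scaling-and-squaring method: since $V=U/1024=U/2^{10}$, we have $\exp(U)=\exp(V)^{1024}$, and ten repeated squarings recover the 1024th power. A minimal Python sketch, checked against `scipy.linalg.expm` (the function name `expm_taylor_ss` is my own):

```python
import numpy as np
from scipy.linalg import expm

def A(t):
    # The example matrix from this answer.
    return np.array([
        [np.cos(t),     1 + t**2 - t,  np.sin(t)],
        [t**2 - 2 * t,  np.tan(t),     -t**3 + 1],
        [2 * t**2 - 1,  np.log(1 + t), t**4 + 1],
    ])

def expm_taylor_ss(U):
    """Scaling and squaring: exp(U) = exp(U/1024)^1024, where exp(U/1024)
    is approximated by its degree-10 Taylor polynomial."""
    V = U / 1024.0
    R = np.eye(U.shape[0])
    term = np.eye(U.shape[0])
    for k in range(1, 11):      # R = I + V + V^2/2! + ... + V^10/10!
        term = term @ V / k
        R = R + term
    for _ in range(10):         # R -> R^(2^10) = R^1024
        R = R @ R
    return R

E = expm_taylor_ss(A(0.6))
print(E[0, 0])  # the answer reports exp(A(0.6))[1,1] (1-indexed) ≈ 1.27172...
```

Since $\|V\|$ is tiny, the degree-10 Taylor remainder is negligible, and the result agrees with `scipy.linalg.expm` to near machine precision.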

For example $\exp(A(0.6))[1,1]\approx 1.2717213247490905765$.

Step 2. For each $j,k$, we calculate a polynomial interpolation of $t \mapsto \exp(A(t))[j,k]$ through the eleven values $\exp(A(i/10))[j,k]$.
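Both steps can be sketched together in Python. Here `scipy.linalg.expm` stands in for the Maple computation of Step 1, and `numpy.polyfit`/`numpy.polyval` build the degree-10 interpolating polynomial of each entry for Step 2; the check point $t=0.55$ is my own choice, not from the answer.

```python
import numpy as np
from scipy.linalg import expm

def A(t):
    return np.array([
        [np.cos(t),     1 + t**2 - t,  np.sin(t)],
        [t**2 - 2 * t,  np.tan(t),     -t**3 + 1],
        [2 * t**2 - 1,  np.log(1 + t), t**4 + 1],
    ])

# Step 1: sample exp(A(t)) at t = 0, 0.1, ..., 1.0.
ts = np.linspace(0.0, 1.0, 11)
samples = np.array([expm(A(t)) for t in ts])   # shape (11, 3, 3)

# Step 2: for each entry (j, k), fit the degree-10 polynomial
# interpolating the eleven sampled values.
coeffs = [[np.polyfit(ts, samples[:, j, k], 10) for k in range(3)]
          for j in range(3)]

def expA_interp(t):
    """Evaluate the entrywise interpolant at t (intended for t in [0, 1])."""
    return np.array([[np.polyval(coeffs[j][k], t) for k in range(3)]
                     for j in range(3)])

# Compare the interpolant with a direct computation between the nodes.
err = np.abs(expA_interp(0.55) - expm(A(0.55))).max()
print(err)
```

Since the entries of $\exp(A(t))$ are smooth on $[0,1]$, the interpolant is accurate between the nodes; for higher accuracy one would take more sample points or a piecewise (spline) interpolation instead of a single high-degree polynomial.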