Show that $e^{tA} = \sum\limits_{k=0}^{n-1} f_k(t)A^k$


Let $A$ be an $n\times n$ matrix whose characteristic polynomial is $$P(\lambda)=\lambda^n+a_{n-1}\lambda^{n-1}+...+a_1\lambda+a_0$$

Now consider the $n$th-order differential equation $$\frac{d^nx}{dt^n}+a_{n-1}\frac{d^{n-1}x}{dt^{n-1}}+...+a_1\frac{dx}{dt}+a_0x=0$$ and let $f_0(t),\: f_1(t),\: ...,\: f_{n-1}(t)$ be the unique solutions satisfying the initial conditions $$f_j^{(l)}(0)= \begin{cases} 1 & j=l \\ 0 & j \neq l\end{cases}$$ for $0 \leq j,\: l\leq n-1$

Show that $e^{tA} = f_0(t)I+f_1(t)A+f_2(t)A^2+...+f_{n-1}(t)A^{n-1}$

I know that $e^{tA}$ is the unique matrix function $M(t)$ such that $\frac{dM}{dt}=AM$ and $M(0) = I$, so all I need to do is show that the right-hand side of the equation satisfies these two conditions. The second one is easy enough to prove; however, the first one presents some trouble. My plan is to show that $\frac{d^n}{dt^n}f_j(t) = A^nf_j(t)$ for each solution $f_j$, and based on the definitions of $A$ and of the solutions to the differential equation, I suspect the Cayley-Hamilton theorem could be used here. However, I haven't been able to engineer it in a way that proves the above statement.
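As a numerical sanity check of the claimed identity (not a proof): the vector $(f_j, f_j', \dots, f_j^{(n-1)})$ solves $y'=Cy$ for the companion matrix $C$ of $P$, and the initial condition $y(0)=e_j$ encodes $f_j^{(l)}(0)=\delta_{jl}$, so $f_j(t)$ is the $(0,j)$ entry of $e^{tC}$. The sketch below uses this; it assumes NumPy/SciPy, and all variable names are illustrative.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n, t = 4, 0.7
A = rng.standard_normal((n, n))

# np.poly returns [1, a_{n-1}, ..., a_0]; reverse the tail so that a[i] = a_i
a = np.poly(A)[1:][::-1]

# Companion matrix C of P: if y = (x, x', ..., x^{(n-1)}) for a solution x of the
# scalar ODE, then y' = C y.
C = np.zeros((n, n))
C[:-1, 1:] = np.eye(n - 1)
C[-1, :] = -a

f = expm(t * C)[0, :]  # f[j] = f_j(t), read off the first row

lhs = expm(t * A)
rhs = sum(f[j] * np.linalg.matrix_power(A, j) for j in range(n))
assert np.allclose(lhs, rhs)
```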


Hint (but not a full answer): Using the definition of the matrix exponential, we write $$ e^{tA} = \sum_{k=0}^\infty \frac{1}{k!}(tA)^k . $$ Then, by the Cayley-Hamilton theorem, we have $P(A) = 0$. Hence, every power of $A$ with exponent greater than or equal to $n$ can be expressed recursively as a linear combination of $I, \dots, A^{n-1}$: $$ \begin{aligned} A^n &= -a_{n-1}A^{n-1} - \dots - a_1 A - a_0 I \\ A^{n+1} &= -a_{n-1}A^{n} - \dots - a_1 A^2 - a_0 A \\ &\;\; \vdots \end{aligned} $$ which may be written $$ A^k = \sum_{i=0}^{n-1} b_i^{(k)} A^i , \qquad k\geq n. $$ Finally, $$ \begin{aligned} e^{tA} &= \sum_{k=0}^{n-1} \frac{1}{k!}(tA)^k + \sum_{k=n}^\infty \frac{1}{k!}(tA)^k \\ &= \sum_{k=0}^{n-1} \frac{1}{k!}(tA)^k + \sum_{k=n}^{\infty} \frac{1}{k!}t^k \sum_{i=0}^{n-1} b_i^{(k)} A^i \\ &= \sum_{k=0}^{n-1} \left( \frac{1}{k!}t^k + \sum_{p=n}^{\infty} \frac{1}{p!}t^p b_k^{(p)} \right) A^k \end{aligned} $$ which is of the expected form.
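The recursive reduction can be carried out mechanically. The following sketch (NumPy; the helper `reduce_power` is my own illustrative name, not part of the answer) computes the coefficients $b_i^{(k)}$ by exactly the recursion above and checks one case against a directly computed power.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
a = np.poly(A)[1:][::-1]  # a[i] = a_i, so P(x) = x^n + a[n-1] x^(n-1) + ... + a[0]

def reduce_power(k):
    """Coefficients b with A^k = sum_i b[i] * A^i for 0 <= i < n."""
    if k < n:
        b = np.zeros(n)
        b[k] = 1.0
        return b
    b = reduce_power(k - 1)
    top = b[n - 1]                       # coefficient that would land on A^n
    b = np.concatenate(([0.0], b[:-1]))  # multiplying by A shifts every index up
    return b - top * a                   # ...and A^n = -sum_i a_i A^i

b = reduce_power(7)
lhs = np.linalg.matrix_power(A, 7)
rhs = sum(b[i] * np.linalg.matrix_power(A, i) for i in range(n))
assert np.allclose(lhs, rhs)
```

Note that `reduce_power(n)` returns exactly `-a`, matching the first line of the recursion above.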


In the space of matrices of the form $c(A)=c_0I+c_1A+...+c_{n-1}A^{n-1}$, multiplication by $A$ gives $$ A\,c(A)=c_0A+c_1A^2+...+c_{n-2}A^{n-1}-c_{n-1}(a_0I+a_1A+...+a_{n-1}A^{n-1}) \\ =-a_0c_{n-1}I+(c_0-a_1c_{n-1})A+...+(c_{n-2}-a_{n-1}c_{n-1})A^{n-1} $$ This shows that the algebra generated by $A$ is spanned by the powers $I,A,...,A^{n-1}$, so there are coefficient functions $f_0(t),...,f_{n-1}(t)$ with $e^{tA}=f_0(t)I+f_1(t)A+...+f_{n-1}(t)A^{n-1}$. From the differential equation $(e^{tA})'=Ae^{tA}$ we thus get \begin{align} f_0'(t)&=-a_0f_{n-1}(t)\\ f_1'(t)&=f_0(t)-a_1f_{n-1}(t)\\ &~~\vdots\\ f_{n-2}'(t)&=f_{n-3}(t)-a_{n-2}f_{n-1}(t)\\ f_{n-1}'(t)&=f_{n-2}(t)-a_{n-1}f_{n-1}(t) \end{align} Substituting backwards, one finds $$ 0=f_{n-1}^{(n)}(t)+a_{n-1}f_{n-1}^{(n-1)}(t)+...+a_1f_{n-1}'(t)+a_0f_{n-1}(t) $$ with $0=f_{n-1}(0)=f_{n-1}'(0)=\dots=f_{n-1}^{(n-2)}(0)$ and $f_{n-1}^{(n-1)}(0)=1$.

Then consider $f_{n-2}(t)=f_{n-1}'(t)+a_{n-1}f_{n-1}(t)$, which is a solution of the same differential equation with $f_{n-2}^{(k)}(0)=0$ for $k<n-2$, $f_{n-2}^{(n-2)}(0)=1$, and $f_{n-2}^{(n-1)}(0)=f_{n-1}^{(n)}(0)+a_{n-1}=0$ (the last equality uses the differential equation at $t=0$), and so on.
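The triangular system above can also be integrated directly. The sketch below (SciPy's `solve_ivp`; variable names are illustrative) solves it with $f_k(0)=\delta_{0k}$, i.e. $M(0)=I$, and compares $\sum_k f_k(t)A^k$ with $e^{tA}$.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

rng = np.random.default_rng(2)
n, t_end = 4, 0.5
A = rng.standard_normal((n, n))
a = np.poly(A)[1:][::-1]  # a[i] = a_i

def rhs_system(t, f):
    # f_0' = -a_0 f_{n-1};  f_k' = f_{k-1} - a_k f_{n-1}  for k >= 1
    df = np.empty(n)
    df[0] = -a[0] * f[-1]
    df[1:] = f[:-1] - a[1:] * f[-1]
    return df

f0 = np.zeros(n)
f0[0] = 1.0  # M(0) = I forces f_k(0) = delta_{0k}
sol = solve_ivp(rhs_system, (0.0, t_end), f0, rtol=1e-10, atol=1e-12)
f = sol.y[:, -1]

lhs = expm(t_end * A)
rhs = sum(f[k] * np.linalg.matrix_power(A, k) for k in range(n))
assert np.allclose(lhs, rhs, atol=1e-6)
```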


In other words, assume first that $P$ is not only the characteristic polynomial of $A$ but also its minimal polynomial. Set $$Q(x,y)=\frac{P(x)-P(y)}{x-y};$$ since $x-y$ divides $P(x)-P(y)$, $Q$ is a polynomial in two variables, so that $$P(D)I-P(A)=Q(D,A)(DI-A)$$ for the differentiation operator $D=\frac{d}{dt}$ and the identity matrix $I$. As $P(A)=0$ by the Cayley-Hamilton theorem, any solution of $Dx=Ax$ also satisfies $P(D)x=0$.
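As a small concrete check that the quotient $Q$ really is a polynomial, here is an illustrative SymPy sketch for a generic cubic ($n=3$); the symbol names are mine, not from the answer.

```python
import sympy as sp

x, y, a0, a1, a2 = sp.symbols('x y a0 a1 a2')
P = lambda s: s**3 + a2*s**2 + a1*s + a0  # generic cubic, for illustration

# x - y divides P(x) - P(y), so the exact quotient is a polynomial in x and y
Q = sp.cancel((P(x) - P(y)) / (x - y))
assert Q.is_polynomial(x, y)
assert sp.expand(Q * (x - y) - (P(x) - P(y))) == 0
```

For this cubic one gets $Q(x,y)=x^2+xy+y^2+a_2(x+y)+a_1$.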

Thus if $M(t)=\sum_{k=0}^{n-1} f_k(t)A^k$ is a matrix solution, we get $\sum_{k=0}^{n-1}P(D)f_k(t)\,A^k=0$, and as these powers of $A$ are linearly independent, $$P(D)f_k(t)=0$$ separately.

$M(0)=I$ implies $f_k(0)=\delta_{0,k}$, $M^{(j)}(0)=A^j$ similarly implies $f_k^{(j)}(0)=\delta_{j,k}$.

Now argue that $Q(D,A)$ is invertible on the solution space of $P(D)x=0$, using the minimality of $P$, to find that a function $M$ constructed this way conversely satisfies $DM=AM$. Then use continuation from a dense set (for instance, matrices with $n$ distinct eigenvalues, for which the characteristic polynomial is minimal) to include matrices $A$ whose characteristic polynomial is not minimal.