How do you solve $\dot{X} = UX$?


Let $X(t) = \left[\begin{matrix}a(t) & b(t) \\c(t) & d(t)\end{matrix}\right]$ and let $U$ be a nonsingular matrix. How do you solve $$\frac{d}{dt} X(t)=UX(t)\,?$$ I presume there is some general method for solving these kinds of ODEs, but I cannot find anything about it online.

BTW:

I know that you can get four ODEs for the four unknown functions of $t$. The problem is that each ODE involves the other unknowns, like so: $$\frac{d a}{d t}=u_{11}a+u_{12}c$$
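Collecting the four equations by column shows that the system splits into two independent $2\times 2$ systems, one for each column of $X(t)$:

$$\frac{d}{dt}\begin{bmatrix} a \\ c \end{bmatrix} = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}\begin{bmatrix} a \\ c \end{bmatrix}, \qquad \frac{d}{dt}\begin{bmatrix} b \\ d \end{bmatrix} = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}\begin{bmatrix} b \\ d \end{bmatrix}.$$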


4 Answers

Accepted answer:

Hint:

The solution is known to be $$X(t)=\exp(Ut)\,X(0),$$ so you have to compute $\exp(Ut)$. One way is to write $U$ as a sum $D+N$, where $D$ is diagonalizable, $N$ is nilpotent, and $D$ and $N$ commute (the Jordan–Chevalley decomposition); then $\exp(Ut)=\exp(Dt)\exp(Nt)$, and $\exp(Nt)$ reduces to a finite sum.
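As a quick numerical check of this splitting (a sketch, assuming NumPy/SciPy; the matrix and the value of $t$ below are made up): for a single $2\times 2$ Jordan block, the diagonalizable part is $\lambda I$ and the remainder is nilpotent, so $\exp(Ut)=e^{\lambda t}(I + Nt)$.

```python
import numpy as np
from scipy.linalg import expm

lam, t = 2.0, 0.5
U = np.array([[lam, 1.0],
              [0.0, lam]])   # a single 2x2 Jordan block (made-up example)
D = lam * np.eye(2)          # diagonalizable part
N = U - D                    # nilpotent part: N @ N == 0, and D @ N == N @ D

# Since D and N commute: exp(Ut) = exp(Dt) exp(Nt), with exp(Nt) = I + N t.
expUt = np.exp(lam * t) * (np.eye(2) + N * t)
print(np.allclose(expUt, expm(U * t)))  # True
```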

Answer:

The general solution is

$$X(t)=e^{Ut}X(0)$$

Answer:

You are correct: it can be solved column-wise, in analogy with the scalar form of this differential equation. The solution is $$X(t)=\exp\left(U t\right) X(0),$$ where $X(0)$ is the initial condition (a matrix) and $$\exp(Ut)=\sum_{n=0}^\infty \frac{(Ut)^n}{n!}$$ is the matrix exponential, which can be computed explicitly if the eigenvalues and eigenvectors are known. Let $U=W \Lambda V^T$ with diagonal matrix $\Lambda$ and right and left eigenvector matrices $W$ and $V$, normalized so that $V^T W = I$. Then $$\exp(Ut)=W \exp(\Lambda t)\, V^T,$$ where $$\Lambda = \operatorname{diag}(\lambda_1 ,\ldots,\lambda_N )$$ is the diagonal matrix of the $N$ eigenvalues of $U$, and $$\exp(\Lambda t) = \operatorname{diag}(e^{\lambda_1 t},\ldots,e^{\lambda_N t}).$$ Since the entries $a(t)$, $b(t)$, etc. are scalars, you have $N=2$ and can compute the eigenvalues from the trace and determinant of $U$: $\lambda_1+\lambda_2=U_{11}+U_{22}$ and $\lambda_1\lambda_2=U_{11}U_{22}-U_{21}U_{12}$.
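A minimal numerical sketch of this eigendecomposition route (assuming NumPy/SciPy; the example matrix is made up, and $V^T$ is taken as $W^{-1}$ so that $V^T W = I$):

```python
import numpy as np
from scipy.linalg import expm

U = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # made-up 2x2 example (eigenvalues -1, -2)
t = 0.7

lam, W = np.linalg.eig(U)             # columns of W: right eigenvectors
VT = np.linalg.inv(W)                 # rows of VT: left eigenvectors, V^T W = I
expUt = (W * np.exp(lam * t)) @ VT    # W exp(Lambda t) V^T

X0 = np.array([[1.0, 0.0],
               [0.0, 1.0]])           # initial condition X(0)
Xt = expUt @ X0                       # the solution X(t)
print(np.allclose(expUt, expm(U * t)))  # True
```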

Answer:

I assume $U$ is a constant matrix, i.e., does not depend on $t$.

We need some initial conditions. So suppose

$X(t_0) = \begin{bmatrix} a(t_0) & b(t_0) \\ c(t_0) & d(t_0) \end{bmatrix}; \tag 1$

then the general solution of

$\dot X = UX \tag 2$

is

$X(t) = e^{(t - t_0) U} X(t_0) = \exp ((t - t_0)U) X(t_0); \tag 3$

we may verify that (3) solves (2) by direct differentiation, using the fact that

$\dfrac{d \exp((t - t_0)U)}{dt} = U \exp((t - t_0)U); \tag 4$

then from (3) and (4),

$\dot X = \dfrac{d \exp((t - t_0)U)}{dt} X(t_0) = U \exp((t - t_0)U) X(t_0) = UX, \tag 5$

which validates the assertion that (3) is the solution to (2); at $t = t_0$ we have

$\exp((t_0 - t_0)U)X(t_0) = \exp(0) X(t_0) = IX(t_0) = X(t_0), \tag 6$

showing that our solution is consistent with the initial conditions on $X(t)$.
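One can also check (3) against a direct numerical integration of (2) (a sketch, assuming SciPy; the matrices and times below are made up):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

U = np.array([[1.0, 2.0],
              [0.0, -1.0]])       # made-up constant matrix U
t0, t1 = 0.0, 0.8
X0 = np.array([[1.0, 2.0],
               [3.0, 4.0]])       # initial condition X(t0)

# Integrate dX/dt = U X with X flattened into a 4-vector.
rhs = lambda t, x: (U @ x.reshape(2, 2)).ravel()
sol = solve_ivp(rhs, (t0, t1), X0.ravel(), rtol=1e-10, atol=1e-12)
X_numeric = sol.y[:, -1].reshape(2, 2)

X_closed = expm((t1 - t0) * U) @ X0   # the closed-form solution (3)
print(np.allclose(X_numeric, X_closed, atol=1e-6))  # True
```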

(4)-(6) demonstrate that our proposed solution (3) in fact works, but offer little insight as to how one might arrive at (3) other than some intelligent guesswork. We may, however, derive (3) from (2) as follows: differentiating (2) we find

$\ddot X = U \dot X = U(UX) = U^2 X; \tag 7$

$\dddot X = U^2 \dot X = U^2 UX = U^3X; \tag 8$

if we assume that, for some positive integer $k$,

$X^{(k)} = \dfrac{d^k X}{dt^k} = U^k X, \tag 9$

then, differentiating,

$X^{(k + 1)} = U^k \dot X = U^k UX = U^{k + 1} X, \tag{10}$

which inductively shows that (9) holds for all integers $k \ge 0$; if we now construct the power series expansion of $X(t)$ about $t = t_0$ we find

$X(t) = \displaystyle \sum_{n=0}^\infty \dfrac{X^{(n)}(t_0)(t - t_0)^n}{n!} = \sum_{n=0}^\infty \dfrac{U^n X(t_0)(t - t_0)^n}{n!} = \left ( \displaystyle \sum_{n=0}^\infty \dfrac{U^n (t - t_0)^n}{n!} \right ) X(t_0) = \exp((t - t_0)U)X(t_0). \tag{11}$

In (7)-(11) we have presented a derivation of the solution (3), which to my mind is an advance on the mere verification presented in (4)-(6).
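The series in (11) can also be checked numerically by comparing its partial sums against SciPy's matrix exponential (a sketch; the example matrix is made up):

```python
import numpy as np
from scipy.linalg import expm

U = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # made-up example: generator of rotations
t, t0 = 1.0, 0.0
X0 = np.eye(2)

# Partial sums of exp((t - t0) U) = sum_n U^n (t - t0)^n / n!
S = np.zeros((2, 2))
term = np.eye(2)                          # the n = 0 term
for n in range(25):
    S = S + term
    term = term @ U * (t - t0) / (n + 1)  # next term of the series

Xt = S @ X0
print(np.allclose(Xt, expm((t - t0) * U) @ X0))  # True
```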