Decompose initial value problem into semi-simple and nilpotent matrices


Consider the initial value problem:

$\begin{pmatrix} \dot x(t) \\ \dot y(t) \\ \dot z(t) \end{pmatrix}= A \vec x(t) = \begin{pmatrix} -1 & 1 & 1 \\ 0 & -1 & 4 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} x(t) \\ y(t) \\ z(t) \end{pmatrix}, \: \: \: \begin{pmatrix} x(0) \\ y(0) \\ z(0) \end{pmatrix}=\begin{pmatrix} x_0 \\ y_0 \\ z_0 \end{pmatrix}$

(i) Determine the eigenvalues and generalised eigenvectors of $A$.

(ii) Decompose $A$ into a semisimple matrix $S$ and a nilpotent matrix $N$ such that $A=S+N$.

(a) first determine the semisimple part $S$ of $A$.

(b) then determine the nilpotent part $N$ of $A$ and show that $N^2=0$.

My attempt:

Since $A$ is an upper triangular matrix we have eigenvalues $\lambda=\{-1,-1,1\}$

Taking $\lambda_3=1, (A-\lambda I)v=\begin{pmatrix} -2 & 1 & 1 \\ 0 & -2 & 4 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix}=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$

So $-2y+4z=0$ and $-2x+y+z=0$

and I've ended up with eigenvector $v_3=(3,4,2)^T$
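As a quick sanity check (a plain-Python sketch, not part of the original attempt), one can verify that this $v_3$ satisfies $(A - I)v_3 = 0$, i.e. that it really is an eigenvector for $\lambda_3 = 1$:

```python
A = [[-1, 1, 1],
     [ 0, -1, 4],
     [ 0,  0, 1]]
v3 = [3, 4, 2]

# residual of (A - I) v3, computed entrywise
residual = [sum(A[i][j] * v3[j] for j in range(3)) - v3[i] for i in range(3)]
print(residual)  # [0, 0, 0], so v3 is an eigenvector for eigenvalue 1
```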

I get the feeling this is wrong because I tried the rest of the question and don't get $N^2=0$

Any help?


There are 2 answers below.

Best answer:

Consider the initial value problem:

$$\vec x'(t) = A \vec x= \begin{pmatrix} -1 & 1 & 1 \\ 0 & -1 & 4 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix}, ~~~~\vec x(0) =\begin{pmatrix} x_0 \\ y_0 \\ z_0 \end{pmatrix}$$

Since $A$ is upper triangular, the eigenvalues can be read off the main diagonal; the distinct values are $\lambda_1 = -1$ and $\lambda_2 = 1$.

$\lambda_1 = -1$ has multiplicity $n_1 = 2$ and $\lambda_2 = 1$ has multiplicity $n_2 = 1$.

The generalized eigenspace associated with $\lambda_1$ is $E_1 = \ker(A - \lambda_1I)^2 = \ker(A + I)^2$. Find

$$(A + I)^2 = \begin{pmatrix} 0 & 0 & 6 \\ 0 & 0 & 8 \\ 0 & 0 & 4 \\ \end{pmatrix}$$

A choice for generalized eigenvectors spanning $E_1$ is $v_1 = (1,0,0)^T$ and $v_2 = (0,1,0)^T$.
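A small check (plain Python rather than Sage, as a sketch) that $(A+I)^2$ is as computed and that $v_1$ and $v_2$ lie in its kernel:

```python
def matmul(X, Y):
    # product of two 3x3 matrices given as lists of rows
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

A = [[-1, 1, 1], [0, -1, 4], [0, 0, 1]]
ApI = [[A[i][j] + int(i == j) for j in range(3)] for i in range(3)]   # A + I
ApI2 = matmul(ApI, ApI)
print(ApI2)  # [[0, 0, 6], [0, 0, 8], [0, 0, 4]], matching the matrix above
for v in ([1, 0, 0], [0, 1, 0]):
    print([sum(ApI2[i][j] * v[j] for j in range(3)) for i in range(3)])  # [0, 0, 0]
```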

The generalized eigenspace associated with $\lambda_2$ is $E_2 = \ker(A-I)$. Find

$$(A - I) = \begin{pmatrix} -2 & 1 & 1 \\ 0 & -2 & 4 \\ 0 & 0 & 0 \\ \end{pmatrix}$$

Let $v_3 = (3, 4, 2)^T$. The transformation matrix is

$$P = (~v_1~|~v_2~|~v_3~) = \begin{pmatrix} 1 & 0 & 3 \\ 0 & 1 & 4 \\ 0 & 0 & 2 \\ \end{pmatrix} ~\mbox{and}~ P^{-1} = \begin{pmatrix} 1 & 0 & -\dfrac{3}{2} \\ 0 & 1 & -2 \\ 0 & 0 & \dfrac{1}{2} \\ \end{pmatrix}$$
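An exact-arithmetic check (a sketch using Python's standard `fractions` module) that the stated $P^{-1}$ really inverts $P$:

```python
from fractions import Fraction as F

P = [[1, 0, 3], [0, 1, 4], [0, 0, 2]]
Pinv = [[F(1), F(0), F(-3, 2)], [F(0), F(1), F(-2)], [F(0), F(0), F(1, 2)]]

# P * P^{-1} should be the identity
prod = [[sum(P[i][k] * Pinv[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
print(prod == [[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # True
```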

We are now ready to find $S$ and $N$ using $A = S + N$.

$$S = P\Lambda P^{-1}, ~\mbox{where}~\Lambda = \mbox{diag}(-1,-1,1)$$

Obtain $S = \begin{pmatrix} -1 & 0 & 3 \\ 0 & -1 & 4 \\ 0 & 0 & 1 \\ \end{pmatrix}$ and $N = A - S = \begin{pmatrix} 0 & 1 & -2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ \end{pmatrix}$. It's easy to verify that $N^2 = 0$.
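The computation $S = P\Lambda P^{-1}$, $N = A - S$ can be reproduced in exact rational arithmetic; the following plain-Python sketch uses the $P$ and $P^{-1}$ from above and confirms $N^2 = 0$:

```python
from fractions import Fraction as F

def matmul(X, Y):
    # product of two square matrices given as lists of rows
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A    = [[-1, 1, 1], [0, -1, 4], [0, 0, 1]]
P    = [[1, 0, 3], [0, 1, 4], [0, 0, 2]]
Pinv = [[F(1), F(0), F(-3, 2)], [F(0), F(1), F(-2)], [F(0), F(0), F(1, 2)]]
Lam  = [[-1, 0, 0], [0, -1, 0], [0, 0, 1]]   # diag(-1, -1, 1)

S = matmul(matmul(P, Lam), Pinv)              # semisimple part
N = [[A[i][j] - S[i][j] for j in range(3)] for i in range(3)]  # nilpotent part

print([[int(x) for x in row] for row in S])   # [[-1, 0, 3], [0, -1, 4], [0, 0, 1]]
print([[int(x) for x in row] for row in N])   # [[0, 1, -2], [0, 0, 0], [0, 0, 0]]
print(matmul(N, N) == [[0] * 3 for _ in range(3)])  # True: N^2 = 0
```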

The matrix exponential is given by $$e^{tA} = e^{t(S+N)} = e^{tS}e^{tN} = Pe^{t\Lambda}P^{-1} \sum_{k=0}^{m-1}\dfrac{(tN)^k}{k!},$$ where the factorization $e^{t(S+N)} = e^{tS}e^{tN}$ is valid because $S$ and $N$ commute.

Since $N^2$ is the zero matrix, all higher-order terms in the series vanish. We have $e^{t\Lambda} = \mbox{diag}(e^{-t},e^{-t},e^t)$ and

$$e^{t A} = e^{t(S+N)} = e^{tS}e^{tN} = P e^{t \Lambda}P^{-1}(I + t N) = \begin{pmatrix} e^{-t} & t e^{-t} & -2 t e^{-t}+\dfrac{3 e^t}{2}-\dfrac{3 e^{-t}}{2} ~~\\ 0 & e^{-t} & -2 e^{-t}+2 e^t ~~\\ 0 & 0 & e^t \\ \end{pmatrix}$$
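This closed form can be sanity-checked numerically (a plain-Python sketch; the tolerance `1e-12` and sample point `t = 0.5` are arbitrary choices) by comparing it against a truncated power series $\sum_{k=0}^{30} (tA)^k/k!$:

```python
import math

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

A = [[-1.0, 1.0, 1.0], [0.0, -1.0, 4.0], [0.0, 0.0, 1.0]]
t = 0.5

# truncated power series for exp(tA): term holds (tA)^k / k!
series = [[float(i == j) for j in range(3)] for i in range(3)]
term = [row[:] for row in series]
for k in range(1, 31):
    term = matmul(term, [[t * a / k for a in row] for row in A])
    series = [[series[i][j] + term[i][j] for j in range(3)] for i in range(3)]

# closed form from above
e, E = math.exp(-t), math.exp(t)
closed = [[e, t * e, -2 * t * e + 1.5 * E - 1.5 * e],
          [0.0, e, -2 * e + 2 * E],
          [0.0, 0.0, E]]

err = max(abs(series[i][j] - closed[i][j]) for i in range(3) for j in range(3))
print(err < 1e-12)  # True
```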

The solution of the initial value problem is

$$\vec x(t) = e^{t A}\, \vec x(0)$$

Second answer:

This became an answer since I could not paste code in a comment. There are computer algebra systems that help with explicit computations in such cases. For me it was simplest to see where the ship is navigating once I saw the examples: it gives the dry theory a visual pattern first. (Of course, an effort to understand the structure must follow.)

Sage gives us for instance:

sage: A = matrix( QQ, 3, 3, [-1,1,1, 0,-1,4, 0,0,1] )
sage: J, T = A.jordan_form(transformation=True)
sage: J
[ 1| 0  0]
[--+-----]
[ 0|-1  1]
[ 0| 0 -1]
sage: T
[  1   1   0]
[4/3   0   1]
[2/3   0   0]

The Jordan form is the above J, which comes in two separate blocks. The upper-left corner is a $1\times 1$ block corresponding to the eigenvalue $1$; the corresponding eigenvector is the first column of T. Now, for the eigenvalue $-1$, the eigenspace has dimension one only; its eigenvector $v=v_{-1}$ (one choice of it) is the second column. We have $(A-(-1)I)v=(A+I)v=0$. To obtain a basis, we search for a further vector in the "generalized eigenspace", the space of vectors annihilated by $(A+I)$, $(A+I)^2$ (and in bigger dimension / for blocks of bigger dimension also the next powers of $(A+I)$). So we then solve $(A+I)v'=v$. (And in bigger dimension also $(A+I)v''=v'$ and so on.) In this way we get the third column of T.
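The chain condition can be checked directly (a plain-Python sketch): with $v = (1,0,0)^T$, the second column of T, and $v' = (0,1,0)^T$, the third column, we should have $(A+I)v' = v$:

```python
A = [[-1, 1, 1], [0, -1, 4], [0, 0, 1]]
v, vp = [1, 0, 0], [0, 1, 0]

# (A + I) v', computed entrywise
ApI_vp = [sum((A[i][j] + int(i == j)) * vp[j] for j in range(3)) for i in range(3)]
print(ApI_vp == v)  # True
```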

Now we have the decompositions:

sage: T.inverse()*A*T == J
True
sage: T*J*T.inverse() == A
True

Mathematical human transcription: $$ \underbrace{ \left[ \begin{array}{rrr} -1 & 1 & 1 \\ 0 & -1 & 4 \\ 0 & 0 & 1 \end{array}\right] }_{A} = \underbrace{ \left[ \begin{array}{rrr} 1 & 1 & 0 \\ \frac{4}{3} & 0 & 1 \\ \frac{2}{3} & 0 & 0 \end{array} \right] }_{T} \ \cdot \ \underbrace{ \left[ \begin{array}{r|rr} 1 & 0 & 0 \\ \hline 0 & -1 & 1 \\ 0 & 0 & -1 \end{array} \right] }_{J} \ \cdot \ \underbrace{ \left[ \begin{array}{rrr} 1 & 1 & 0 \\ \frac{4}{3} & 0 & 1 \\ \frac{2}{3} & 0 & 0 \end{array} \right]^{-1} }_{T^{-1}} $$ Here, let $$ \begin{aligned} S' &= \left[ \begin{array}{r|rr} 1 & 0 & 0 \\ \hline 0 & -1 & \boxed{0} \\ 0 & 0 & -1 \end{array} \right] \ , \\ N' &= \left[ \begin{array}{r|rr} 0 & 0 & 0 \\ \hline 0 & 0 & \boxed{1} \\ 0 & 0 & 0 \end{array} \right] \ . \end{aligned} $$ Then the split $J = S'+N'$ (with a diagonal $S'$ and a nilpotent $N'$) induces, by conjugation with $T$ and parallel to $A=TJT^{-1}$, $$ \begin{aligned} S &= TS'T^{-1}\ ,\\ N &= TN'T^{-1}\ , \end{aligned} $$ and now $A=S+N$. This fully explicit representation of $A$ is in fact never needed. I suppose the course this problem comes from gave an ad-hoc way to get these $S,N$ directly for $A$, and then, without any mention of the Jordan normal form, passed to the solution of the system of differential equations we are mainly interested in. This is not the structural way to do things, but yes, I would also do it to save time for a specific purpose. (And the students with an interest in pure mathematics and/or linear algebra will investigate the structure more deeply...)
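The conjugation argument can be verified in exact rational arithmetic (a plain-Python sketch with $T$, $S'$, $N'$ as above; the small Gauss-Jordan `inverse` helper is mine, not part of the answer):

```python
from fractions import Fraction as F

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def inverse(M):
    # Gauss-Jordan elimination over the rationals
    n = len(M)
    aug = [[F(M[i][j]) for j in range(n)] + [F(int(i == j)) for j in range(n)]
           for i in range(n)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        aug[col] = [x / aug[col][col] for x in aug[col]]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                aug[r] = [a - aug[r][col] * b for a, b in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

A  = [[-1, 1, 1], [0, -1, 4], [0, 0, 1]]
T  = [[F(1), F(1), F(0)], [F(4, 3), F(0), F(1)], [F(2, 3), F(0), F(0)]]
Sp = [[1, 0, 0], [0, -1, 0], [0, 0, -1]]   # S': diagonal part of J
Np = [[0, 0, 0], [0, 0, 1], [0, 0, 0]]     # N': nilpotent part of J

Tinv = inverse(T)
S = matmul(matmul(T, Sp), Tinv)
N = matmul(matmul(T, Np), Tinv)
print([[S[i][j] + N[i][j] for j in range(3)] for i in range(3)] == A)  # True: A = S + N
print(matmul(N, N) == [[0] * 3 for _ in range(3)])                     # True: N^2 = 0
```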

The solution is now simple, since $$ \begin{aligned} A &= TJT^{-1}\ ,\\ \exp (tA) &= T\exp(tJ)T^{-1}\ ,\\ \exp(tJ) &= \left[ \begin{array}{rrr} e^{t} & 0 & 0 \\ 0 & e^{-t} & t e^{-t} \\ 0 & 0 & e^{-t} \end{array} \right]\ , \end{aligned} $$ and so on.


Note: In fact, Sage can compute:

sage: A = matrix( QQ, 3, 3, [-1,1,1, 0,-1,4, 0,0,1] )
sage: var('t');
sage: exp( t*A )
[ e^(-t)   t*e^(-t)   -1/2*(4*t - 3*e^(2*t) + 3)*e^(-t)]
[ 0        e^(-t)                2*(e^(2*t) - 1)*e^(-t)]
[ 0        0                                        e^t]

Later EDIT:

Sorry, the matrices $S'$, $N'$ above were originally denoted $S$, $N$.

I gave the solution involving the Jordan normal form, since I do not know the ad-hoc way to get $S,N$ rapidly.

Also, it seems simpler to me to use and compute the exponential via the Jordan blocks: for a nilpotent $N$ we have $\exp(t(\lambda I+N))=\exp(t\lambda I)\exp(tN)$, since $\lambda I$ and $N$ commute, and it is easiest to compute $\exp(tN)$ for an $N$ already in the shift form $$ N=\begin{bmatrix} 0 &1 &0 &0 &0 &\cdots &0\\ 0 &0 &1 &0 &0 &\cdots &0\\ 0 &0 &0 &1 &0 &\cdots &0\\ 0 &0 &0 &0 &1 &\cdots &0\\ 0 &0 &0 &0 &0 &\cdots &0\\ \vdots &\vdots &\vdots &\vdots &\vdots &\ddots &1\\ 0 &0 &0 &0 &0 &\cdots &0\\ \end{bmatrix} $$ when $$ \exp (tN)=\begin{bmatrix} 1 & t & \frac 1{2!}t^2 & \frac 1{3!}t^3 &\frac 1{4!}t^4 &\cdots &\frac 1{k!}t^k\\ 0 &1 &t &\frac 1{2!}t^2 &\frac 1{3!}t^3 &\cdots &\frac 1{(k-1)!}t^{k-1}\\ 0 &0 &1 &t &\frac 1{2!}t^2 &\cdots &\frac 1{(k-2)!}t^{k-2}\\ 0 &0 &0 &1 &t &\cdots &\frac 1{(k-3)!}t^{k-3}\\ 0 &0 &0 &0 &1 &\cdots &\frac 1{(k-4)!}t^{k-4}\\ \vdots &\vdots &\vdots &\vdots &\vdots &\ddots &t\\ 0 &0 &0 &0 &0 &\cdots &1 \end{bmatrix} $$ That is how I learned it a long time ago, and it is hard to get it out of my system.
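The terminating series for the shift matrix can be checked concretely (a plain-Python sketch for the $5\times 5$ case at $t=1$; the claim is that $(\exp N)_{ij} = 1/(j-i)!$ for $j \ge i$):

```python
from fractions import Fraction as F
from math import factorial

n = 5
# upper-shift nilpotent: ones on the first superdiagonal
N = [[int(j == i + 1) for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# exp(N) = sum_{p=0}^{n-1} N^p / p!  (N^n = 0, so the series terminates)
expN = [[F(int(i == j)) for j in range(n)] for i in range(n)]
power = [[int(i == j) for j in range(n)] for i in range(n)]
for p in range(1, n):
    power = matmul(power, N)
    expN = [[expN[i][j] + F(power[i][j], factorial(p)) for j in range(n)]
            for i in range(n)]

expected = [[F(1, factorial(j - i)) if j >= i else F(0) for j in range(n)]
            for i in range(n)]
print(expN == expected)  # True
```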