Finding a kind of matrix


Given $h > 0$, let

$$B := \begin{pmatrix} 0 & -2 \\ \frac{1}{2h} & 0 \end{pmatrix}$$

or, more generally, let

$$B := \begin{pmatrix} 0^{n\times n} & -2 I^{n\times n} \\ \frac{1}{2h}I^{n\times n} & 0^{n\times n} \end{pmatrix}$$

Is there an explicit form for $e^{B}$?


There are two answers below.

**Answer 1**

If $\bar{\textbf{x}}'(t) = B\bar{\textbf{x}}(t)$, then $\bar{\textbf{x}}(t) = e^{tB}\begin{pmatrix} x_0\\ y_0 \end{pmatrix}$.

It will be fairly easy to solve the ODEs $x' = -2y$ and $y' = \frac{1}{2h}x$. From this you will have a solution.

**Answer 2**

By block multiplication,
$B^2 = \left[\begin{matrix}(-2I_n)\left(\frac{1}{2h}I_n\right) & \mathbf{0}\\\mathbf{0} & \left(\frac{1}{2h}I_n\right)(-2I_n)\end{matrix}\right] = -\frac{1}{h} I_{2n},$

which means that $B$ has two distinct eigenvalues, given by the two distinct solutions $\lambda_{1,2} = \pm\frac{i}{\sqrt{h}}$ of $x^2 = -\frac{1}{h}$.

This also tells you that $\mathbf 0 = B^2+\frac{1}{h} I_{2n} =\big(B-\lambda_1 I \big)\big(B-\lambda_2 I \big)$, so $x^2 + \frac{1}{h}$ is the minimal polynomial of $B$. Since it has distinct roots, $B$ is diagonalizable over $\mathbb C$, so that is one way to finish. A nicer finish is below.
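Both facts are easy to sanity-check numerically. A minimal sketch assuming NumPy, with $n$ and $h$ chosen arbitrarily:

```python
import numpy as np

# Arbitrary illustrative choices; any n >= 1 and h > 0 work.
n, h = 3, 0.7
Z, I = np.zeros((n, n)), np.eye(n)
B = np.block([[Z, -2 * I], [I / (2 * h), Z]])

# Block multiplication gives B^2 = -(1/h) I_{2n} ...
print(np.allclose(B @ B, -(1.0 / h) * np.eye(2 * n)))  # True

# ... so every eigenvalue squares to -1/h, i.e. the eigenvalues are +/- i/sqrt(h).
print(np.allclose(np.linalg.eigvals(B) ** 2, -1.0 / h))  # True
```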

Treated as abstract vectors, $\big\{I,B\big\}$ spans a space containing $B^k$ for every natural number $k$. Or, if you prefer a more concrete approach, use the vec operator, which stacks the columns of a matrix on top of one another to get a 'big coordinate vector' with $(2n)^2$ components, and consider $\big\{\text{vec}\big(I\big),\text{vec}\big(B\big)\big\}$: every $\text{vec}\big(B^k\big)$ may be written as a linear combination of those two independent vectors.

Either way you approach it, let $v_k$ be the vector associated with the $k$th power of $B$ (note: the zeroth power of $B$ is the identity matrix). Since $B^2 = -\frac{1}{h}I$, the powers satisfy the recurrence $v_{k+2} = -\frac{1}{h}v_k$; from here, use the companion matrix to model the recurrence:
$C := \begin{bmatrix} 0 & -a_0\\ 1 & -a_{1} \end{bmatrix} = \begin{bmatrix} 0 & -\frac{1}{h}\\ 1 & 0 \end{bmatrix}$

and, working through the recurrence (with $\mathbf e_1$ being the first standard basis vector of $\mathbb{R}^2$), for $k=0,1,2,3,\dots$

$\bigg[\begin{array}{c|c} v_0 & v_1 \end{array}\bigg]\mathbf C^k\mathbf e_1 = \bigg[\begin{array}{c|c} v_k & v_{k+1} \end{array}\bigg]\mathbf e_1 = v_k$
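This identity can be checked numerically for the $2\times 2$ case. A sketch assuming NumPy, with vec coordinates as above and an arbitrary $h$:

```python
import numpy as np

# Arbitrary illustrative choice; any h > 0 works.
h = 0.5
B = np.array([[0.0, -2.0], [1.0 / (2 * h), 0.0]])
C = np.array([[0.0, -1.0 / h], [1.0, 0.0]])

def vec(M):
    return M.ravel(order="F")  # stack the columns of M

V = np.column_stack([vec(np.eye(2)), vec(B)])  # [v_0 | v_1]
e1 = np.array([1.0, 0.0])

# Check [v_0 | v_1] C^k e_1 = v_k = vec(B^k) for the first few powers.
ok = all(
    np.allclose(V @ np.linalg.matrix_power(C, k) @ e1,
                vec(np.linalg.matrix_power(B, k)))
    for k in range(8)
)
print(ok)  # True
```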

Thus
$$ \begin{align} &e^B\\ &=\sum_{k=0}^\infty\frac{1}{k!} B^k\\ &=\sum_{k=0}^\infty\frac{1}{k!}\big( v_k\big)\\ &=\sum_{k=0}^\infty\frac{1}{k!}\Big( \bigg[\begin{array}{c|c} v_0 & v_1 \end{array}\bigg] C^k\mathbf e_1 \Big)\\ &=\bigg[\begin{array}{c|c} v_0 & v_1 \end{array}\bigg]\Big(\sum_{k=0}^\infty\frac{1}{k!} C^k \Big)\mathbf e_1\\ &=\bigg[\begin{array}{c|c} v_0 & v_1 \end{array}\bigg]\big(e^C\big)\mathbf e_1\\ &=\bigg[\begin{array}{c|c} v_0 & v_1 \end{array}\bigg]\big(e^C\mathbf e_1\big)\\ \end{align} $$

So you want to compute $e^C$ and then 'grab' its first column. There are various ways to do this. As mentioned in the comments, there is a closed-form approach for the matrix exponential of a $2\times 2$ matrix. Alternatively, you can directly diagonalize $C$, which is easy since you know it has distinct eigenvalues $\big\{\lambda_1,\lambda_2\big\}$ and the Vandermonde matrix diagonalizes it. You can also exploit the known form of the solution. But the point is that you have a $2n\times 2n$ matrix that is diagonalizable and has a degree-2 minimal polynomial, so rather than directly computing the matrix exponential of a $2n\times 2n$ matrix, the problem reduces to computing the matrix exponential of a nice $2\times 2$ matrix.