Show that there are constants $K$ and $\alpha$ such that $|(e^{At})_{ij}|\leq e^{-\alpha t}K$.


I want to prove that if all eigenvalues of $\textbf{A}$ in the system $\dot{\textbf{x}}=\textbf{Ax}$ have negative real parts, then there exist constants $K$ and $\alpha$ such that $$|(e^{\textbf{A}t})_{ij}|\leq e^{-\alpha t}K$$ for $1\leq i,j\leq n$. This has been my proof so far: \begin{align*} e^{\textbf{A}t}&=\textbf{X}(t)\textbf{X}(0)^{-1}\text{ where $\textbf{X}$ is the fundamental matrix of solutions,}\\ &=\begin{pmatrix} e^{\lambda_1t}x_{11}(t)&\cdots&e^{\lambda_nt}x_{1n}(t)\\ \vdots&\ddots&\vdots\\ e^{\lambda_1t}x_{n1}(t)&\cdots&e^{\lambda_nt}x_{nn}(t) \end{pmatrix}\begin{pmatrix} a_{11}&\cdots&a_{1n}\\ \vdots&\ddots&\vdots\\ a_{n1}&\cdots&a_{nn} \end{pmatrix}. \end{align*} Thus, every component of $e^{\textbf{A}t}$ is of the form \begin{align*} (e^{\textbf{A}t})_{ij}=a_{1j}e^{\lambda_1t}x_{i1}(t)+\cdots+a_{nj}e^{\lambda_nt}x_{in}(t). \end{align*} So, writing $\lambda_k=-p_k+iq_k$ with each $p_k>0$, \begin{align*}|(e^{\textbf{A}t})_{ij}|&=|a_{1j}e^{\lambda_1t}x_{i1}(t)+\cdots+a_{nj}e^{\lambda_nt}x_{in}(t)|\\ &=|a_{1j}e^{(-p_1+iq_1)t}x_{i1}(t)+\cdots+a_{nj}e^{(-p_n+iq_n)t}x_{in}(t)|\\ &=|a_{1j}e^{-p_1t}e^{iq_1t}x_{i1}(t)+\cdots+a_{nj}e^{-p_nt}e^{iq_nt}x_{in}(t)|\\ &\leq e^{-\alpha t}|a_{1j}e^{iq_1t}x_{i1}(t)+\cdots+a_{nj}e^{iq_nt}x_{in}(t)| \end{align*} where $\alpha=\min\{p_1,\ldots,p_n\}$, i.e. $-\alpha=\max\{-p_1,\ldots,-p_n\}$. From this point, I am stuck at finding $K$. I'd appreciate it if someone could help me figure this out and improve my proof.
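As a numerical sanity check of the bound being proved (the matrix, $\alpha$, and $K$ below are my own illustrative choices, not from the problem), here is a short sketch that computes $e^{\mathbf{A}t}$ by a truncated Taylor series and verifies the entrywise estimate:

```python
import numpy as np

def expm_taylor(A, t, terms=80):
    """e^{At} via a truncated Taylor series (adequate for small examples)."""
    n = A.shape[0]
    term = np.eye(n)
    total = np.eye(n)
    X = A * t
    for k in range(1, terms):
        term = term @ X / k
        total = total + term
    return total

# Illustrative matrix (my own choice): eigenvalues -1 and -2, so every
# real part is negative, p_min = 1, and we may take alpha = 1.
A = np.array([[-1.0, 3.0],
              [ 0.0, -2.0]])
alpha, K = 1.0, 10.0   # K chosen generously for this example

for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    E = expm_taylor(A, t)
    assert np.max(np.abs(E)) <= K * np.exp(-alpha * t)
```

For this upper-triangular example $e^{\mathbf{A}t}$ has the closed form $\begin{pmatrix}e^{-t}&3(e^{-t}-e^{-2t})\\0&e^{-2t}\end{pmatrix}$, so every entry is indeed dominated by $Ke^{-\alpha t}$.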


There are 2 best solutions below


If I apply the triangle inequality and use the fact that $\left|e^{iqt}\right|=1$, then the right side of the inequality becomes $$e^{-\alpha t}\underline{(\left|a_{1j}x_{i1}(t)\right|+\cdots+\left|a_{nj}x_{in}(t)\right|)}.$$ However, I am not certain whether the underlined part is the constant $K$ that I am looking for.
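The worry is reasonable: when $\mathbf{A}$ has a repeated eigenvalue with too few eigenvectors, the $x_{ij}(t)$ are polynomials in $t$, so the underlined sum can grow with $t$. A small numerical sketch (the Jordan-block matrix is my own example, not from the post) showing that taking $\alpha$ strictly below $\min\{p_1,\ldots,p_n\}$ absorbs the polynomial factor and restores a constant $K$:

```python
import numpy as np

# For the 2x2 Jordan block with eigenvalue -1 (my own example),
# e^{At} has the closed form e^{-t} * [[1, t], [0, 1]], so the
# off-diagonal entry is t*e^{-t}: the polynomial factor t is not constant.
def expm_jordan(t):
    return np.exp(-t) * np.array([[1.0, t],
                                  [0.0, 1.0]])

# Taking alpha = 1/2, strictly below p_min = 1, absorbs the factor t:
# t*e^{-t} <= e^{-t/2} because t*e^{-t/2} peaks at t = 2 with value 2/e < 1.
alpha, K = 0.5, 1.0
for t in np.linspace(0.0, 20.0, 201):
    E = expm_jordan(t)
    assert np.max(np.abs(E)) <= K * np.exp(-alpha * t) + 1e-12
```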


Here is a somewhat different approach. It gives a bound on the norm $\parallel \exp [\mathbf{A}t]\parallel$ of $\exp [\mathbf{A}t]$, and hence on $\parallel \mathbf{x}(t)\parallel$, rather than on the individual matrix elements.

I gather from the context that $\mathbf{A}$ is an $n\times n$ matrix and $\mathbf{x}\in \mathbb{C}^{n}$. Then we can equip $\mathbb{C}^{n}$ with the standard inner product $\langle \mathbf{x},\mathbf{y}\rangle =\sum_{j=1}^{n}\overline{x_{j}}y_{j}$ and norm squared $\parallel \mathbf{x}\parallel ^{2}=\sum_{j=1}^{n}\overline{x_{j}}x_{j}$. Assuming $\mathbf{A}$ is diagonalizable, we write \begin{equation*} \mathbf{A}=\sum_{j=1}^{n}\lambda _{j}|\mathbf{u}_{j}\rangle \langle \mathbf{v}_{j}|, \end{equation*} where $\{\mathbf{u}_{j}\}$ and $\{\mathbf{v}_{j}\}$ form a bi-orthogonal system ($\langle \mathbf{v}_{j}|\mathbf{u}_{h}\rangle =\delta _{jh}$), normalized so that each rank-one operator $|\mathbf{u}_{j}\rangle \langle \mathbf{v}_{j}|$ has norm at most $1$. Now we expand and estimate \begin{eqnarray*} \exp [\mathbf{A}t] &=&\sum_{j=1}^{n}\exp [\lambda _{j}t]\,|\mathbf{u}_{j}\rangle \langle \mathbf{v}_{j}|, \\ \parallel \exp [\mathbf{A}t]\parallel &\leqslant &\sum_{j=1}^{n}|\exp [\lambda _{j}t]|\,\parallel |\mathbf{u}_{j}\rangle \langle \mathbf{v}_{j}|\parallel \leqslant \sum_{j=1}^{n}\exp [(\mathrm{Re}\,\lambda _{j})t]\leqslant n\exp [-|\mathrm{Re}\,\lambda _{0}|\,t], \end{eqnarray*} where $\lambda _{0}$ is the eigenvalue whose real part is closest to $0$. Then \begin{eqnarray*} \mathbf{x}(t) &=&\exp [\mathbf{A}t]\cdot \mathbf{x}(0), \\ \parallel \mathbf{x}(t)\parallel &=&\parallel \exp [\mathbf{A}t]\cdot \mathbf{x}(0)\parallel \leqslant n\exp [-|\mathrm{Re}\,\lambda _{0}|\,t]\,\parallel \mathbf{x}(0)\parallel . \end{eqnarray*} Here $\parallel \mathbf{F}\parallel$ is the sup-norm of the operator $\mathbf{F}$ (in this case an $n\times n$ matrix): \begin{equation*} \parallel \mathbf{F}\parallel =\sup_{\parallel \mathbf{x}\parallel =1}\parallel \mathbf{F}\cdot \mathbf{x}\parallel . \end{equation*}
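A quick numerical check of this norm bound (the symmetric matrix below is my own example; symmetry makes the eigenvector basis orthonormal, so each rank-one projector has operator norm exactly $1$):

```python
import numpy as np

# Symmetric example matrix (my own choice): eigenvalues -3 and -1, both
# negative, and the eigenvectors are orthonormal, so every projector
# |u_j><v_j| has operator norm 1.
A = np.array([[-2.0,  1.0],
              [ 1.0, -2.0]])
evals, V = np.linalg.eigh(A)      # evals = [-3., -1.] in ascending order
lam0 = evals.max()                # eigenvalue with real part closest to 0
n = A.shape[0]

for t in [0.0, 0.5, 1.0, 3.0]:
    # For symmetric A: e^{At} = V diag(e^{lambda_j t}) V^T
    E = V @ np.diag(np.exp(evals * t)) @ V.T
    opnorm = np.linalg.norm(E, 2)  # spectral (sup) norm
    assert opnorm <= n * np.exp(-abs(lam0) * t) + 1e-12
```

Here the true norm is $e^{-t}$ while the bound is $2e^{-t}$, so the factor $n$ is visible but the exponential rate is sharp for this example.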