The Operator Norm of Basis Change Matrices for a more explicit Gelfand Formula


Gelfand's formula states that the growth of the operator norm of the matrix powers $A^n$ is governed by the spectral radius $\rho(A)$, in the sense that $$ \lim_{n\to\infty} \sqrt[n]{\|A^n\|} = \inf_{n\in\mathbb{N}} \sqrt[n]{\|A^n\|} = \rho(A). $$ To obtain a more explicit version for my particular problem I used $$ A^n = UJ^nU^{-1}, $$ where $U$ is a basis change matrix and $J$ is the Jordan form. In my particular case this resulted in $$ \|A^n\|\le \|U\|\|U^{-1}\|\,(n+\rho(A))\,\rho(A)^{n-1}. $$ This is more explicit in the exponent, but because of the constant $\|U\|\|U^{-1}\|$ in front it is not really explicit overall.
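As a quick numerical illustration (a sketch only; the matrix and parameter values are example choices, not part of the question), Gelfand's limit can be observed for a sample $2\times 2$ matrix. Since the formula holds for any submultiplicative matrix norm, the easily computed Frobenius norm is used here:

```python
# Sketch: check Gelfand's formula numerically for an example 2x2 matrix.
# The formula holds for any submultiplicative matrix norm, so we use the
# Frobenius norm, which is trivial to compute by hand.
import math

def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def frobenius(A):
    return math.sqrt(sum(x * x for row in A for x in row))

# Example parameters (arbitrary choices within 0 <= a <= (1 - sqrt(beta))**2).
beta, a = 0.5, 0.05
xi = 1 + beta - a
A = [[0.0, 1.0], [-beta, xi]]

# Spectral radius from the characteristic polynomial r^2 - xi*r + beta = 0.
disc = xi * xi - 4 * beta
rho = (xi + math.sqrt(disc)) / 2 if disc >= 0 else math.sqrt(beta)

# ||A^n||_F^(1/n) approaches rho as n grows, and never drops below it.
P, estimates = A, []
for n in range(1, 201):
    estimates.append(frobenius(P) ** (1.0 / n))
    P = matmul(P, A)
print(rho, estimates[-1])
```

The last estimate sits just above $\rho(A)$, reflecting the $n$-th root of the constant in front of $\rho(A)^n$ slowly tending to $1$.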

So the question is whether there are methods to bound the norms of basis change matrices. I am mostly interested in $2\times 2$ matrices.

Kozyakin (2009) showed something similar using the constant

$$ \frac{\|A\|^d}{\|A^d\|} $$

directly, where $A$ is a $d\times d$ matrix, instead of using a basis change. So in my case $d=2$. It is very likely that there is no good bound in the general case. In that case I would be glad to hear about special cases which might be helpful.
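For concreteness, Kozyakin's constant for $d=2$ can be evaluated directly. A minimal sketch (the matrix entries below are an arbitrary example) computes the operator 2-norm from the closed-form largest eigenvalue of $A^\top A$:

```python
# Sketch: Kozyakin's constant ||A||^d / ||A^d|| for d = 2 in the operator
# 2-norm (largest singular value), computed via the closed-form largest
# eigenvalue of A^T A for a 2x2 matrix.  The matrix is an example choice.
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def opnorm(A):
    """Operator 2-norm of a real 2x2 matrix."""
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
    M = matmul(At, A)                      # A^T A is symmetric PSD
    p, q, s = M[0][0], M[0][1], M[1][1]
    lam_max = ((p + s) + math.sqrt((p - s) ** 2 + 4 * q * q)) / 2
    return math.sqrt(lam_max)

A = [[0.0, 1.0], [-0.5, 1.45]]             # example 2x2 matrix
ratio = opnorm(A) ** 2 / opnorm(matmul(A, A))
print(ratio)                               # >= 1 by submultiplicativity
```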

The particular matrix I am interested in is

$$ \begin{pmatrix} 0 & 1 \\ -\beta & 1+\beta - a \end{pmatrix} $$ where $0\le\beta\le1$ and $0\le a\le (1-\sqrt{\beta})^2$.
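A small sketch confirms that the parameter constraint $a \le (1-\sqrt{\beta})^2$ is exactly the condition $\xi^2 \ge 4\beta$ for $\xi = 1+\beta-a$, i.e. both eigenvalues are real, with spectral radius at most $1$ (attained at $a=0$). The grid below is an arbitrary sampling of the admissible region:

```python
# Sketch: sample the admissible parameter region and verify that the
# discriminant of r^2 - xi*r + beta stays nonnegative (real eigenvalues)
# and that the spectral radius stays at most 1.
import math

min_disc, max_rho = float("inf"), 0.0
for i in range(1, 10):                     # beta on a grid in (0, 1)
    beta = i / 10
    for j in range(0, 11):                 # a from 0 up to (1 - sqrt(beta))^2
        a = j / 10 * (1 - math.sqrt(beta)) ** 2
        xi = 1 + beta - a
        disc = xi * xi - 4 * beta          # discriminant of r^2 - xi*r + beta
        r1 = (xi + math.sqrt(max(disc, 0.0))) / 2
        min_disc = min(min_disc, disc)
        max_rho = max(max_rho, r1)
print(min_disc, max_rho)
```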


Instead of the Jordan decomposition, one should use the Schur decomposition in this case: unitary matrices are isometries in the Euclidean norm and therefore also in the induced operator norm, so there is no need to estimate them at all.

Lemma: Explicit Schur Decomposition for a Specific Matrix

For a real matrix of the form

$$ R = \begin{pmatrix} 0 & 1 \\ -\beta & \xi \end{pmatrix}, $$ the unitary operator
$$ Q:=\frac{1}{\sqrt{1+|r_1|^2}} \begin{pmatrix} 1 & \overline{r}_1 \\ r_1 & -1 \end{pmatrix}, $$ where $ r_{1/2} = \tfrac12 \left( \xi \pm \sqrt{\xi^2 - 4\beta} \right) $ are the eigenvalues of $R$, yields the Schur decomposition \begin{align*} Q^* R Q &=\begin{pmatrix} r_1 & -\frac{(1+\beta)(1+\overline{r}_1^2)}{1+|r_1|^2} \\ 0 & \frac{\beta\overline{r}_1 + r_2}{1+|r_1|^2} \end{pmatrix}. \end{align*} For complex eigenvalues, i.e. $\xi^2 \le 4\beta$ (including the degenerate real case $\xi^2 = 4\beta$), this simplifies to \begin{align*} Q^* R Q &=\begin{pmatrix} r_1 & -(1+\overline{r}_1^2) \\ 0 & \overline{r}_1 \end{pmatrix}. \end{align*} Note that $r_1$ and $r_2$ can always be swapped to achieve a more favorable result.
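The lemma can be verified numerically. The following sketch (with example parameter values) builds $Q$ from $r_1$ and checks that $Q^* R Q$ matches the stated entries in both the real and complex eigenvalue regimes:

```python
# Sketch: verify the lemma numerically.  Build Q from r1 and check that
# Q* R Q is upper triangular with the entries stated in the lemma, in both
# the real (xi^2 > 4*beta) and complex (xi^2 < 4*beta) regimes.
import cmath

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def schur_error(beta, xi):
    """Max entrywise deviation of Q* R Q from the lemma's formula."""
    d = cmath.sqrt(xi * xi - 4 * beta)
    r1, r2 = (xi + d) / 2, (xi - d) / 2
    n = 1 + abs(r1) ** 2
    s = n ** -0.5
    Q = [[s, s * r1.conjugate()], [s * r1, -s]]
    R = [[0.0, 1.0], [-beta, xi]]
    T = matmul(Q, matmul(R, Q))            # Q is Hermitian, so Q* = Q
    t12 = -(1 + beta) * (1 + r1.conjugate() ** 2) / n
    t22 = (beta * r1.conjugate() + r2) / n
    return max(abs(T[0][0] - r1), abs(T[0][1] - t12),
               abs(T[1][0]), abs(T[1][1] - t22))

err_real = schur_error(0.5, 1.45)          # real eigenvalues
err_cplx = schur_error(0.5, 1.0)           # complex conjugate pair
print(err_real, err_cplx)
```

Both errors are at the level of floating-point roundoff.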

Proof

Since \begin{align*} \frac{1}{\sqrt{1+|r_1|^2}} \begin{pmatrix} 1 \\ r_1 \end{pmatrix} \end{align*} is a normalized eigenvector of $R$ for the eigenvalue $r_1$, there are, up to phase, only two options for an orthonormal second column, which results in $Q$.

Recall that any eigenvalue of $R$, in particular $r_1$ and $\overline{r}_1\in\{r_1, r_2\}$, is a root of the characteristic polynomial \begin{align}\label{eq: property of eigenvalues of R} \det (r\mathbb{I} - R) = r^2 - r\xi + \beta = 0. \end{align} Now we can calculate the Schur decomposition of $R$ (note that $Q$ is Hermitian, so $Q^* = Q$): \begin{align*} Q^* R Q &= \frac{1}{1+|r_1|^2} \begin{pmatrix} 1 & \overline{r}_1 \\ r_{1} & -1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -\beta & \xi \end{pmatrix} \begin{pmatrix} 1 & \overline{r}_1 \\ r_1 & -1 \end{pmatrix}\\ &= \frac{1}{1+|r_{1}|^2} \begin{pmatrix} 1 & \overline{r}_1 \\ r_1 & -1 \end{pmatrix} \begin{pmatrix} r_1 & -1 \\ \smash{\underbrace{-\beta + \xi r_1}_{ =r_1^2 }} & -\beta \overline{r}_1 - \xi \end{pmatrix} \vphantom{\underbrace{\begin{pmatrix}1\\1\end{pmatrix}}_{=2}} \\ &=\frac{1}{1+|r_{1}|^2}\begin{pmatrix} r_1(1+|r_1|^2) & -\beta \overline{r}_1^2 - (1+\xi \overline{r}_1) \\ 0 & \beta\overline{r}_1 + \xi - r_1 \end{pmatrix}\\ &=\begin{pmatrix} r_1 & -\frac{\beta \overline{r}_1^2 + 1+ (\overline{r}_1^2 + \beta)}{1+|r_1|^2} \\ 0 & \frac{\beta\overline{r}_1 + \xi - r_1}{1+|r_1|^2} \end{pmatrix}\\ &=\begin{pmatrix} r_1 & -\frac{(1+\beta)(1+\overline{r}_1^2)}{1+|r_1|^2} \\ 0 & \frac{\beta\overline{r}_1 + r_2}{1+|r_1|^2} \end{pmatrix} \end{align*} where we have used $\xi \overline{r}_1 = \overline{r}_1^2 + \beta$ in the third step and $\xi - r_1 = r_2$ in the last. Now in the complex case (and the degenerate case $4\beta = \xi^2$) we have $|r_1|=|r_2|=\sqrt{\beta}$ and $\overline{r}_1 = r_2$, so the normalization $1+|r_1|^2 = 1+\beta$ cancels the factor $(1+\beta)$, which results in \begin{align*} Q^* R Q &=\begin{pmatrix} r_1 & -(1+\overline{r}_1^2) \\ 0 & \overline{r}_1 \end{pmatrix}. \end{align*}
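A small numerical sanity check of the final simplification (with example parameters satisfying $\xi^2 < 4\beta$):

```python
# Sketch: sanity-check the complex-case simplification with example
# parameters satisfying xi^2 < 4*beta.  Here |r1|^2 = r1 * conj(r1)
# = r1 * r2 = beta, so 1 + |r1|^2 = 1 + beta cancels the factor (1 + beta).
import cmath

beta, xi = 0.5, 1.0
r1 = (xi + cmath.sqrt(xi * xi - 4 * beta)) / 2
r2 = (xi - cmath.sqrt(xi * xi - 4 * beta)) / 2
n = 1 + abs(r1) ** 2

# General entries from the lemma vs. their simplified complex-case forms.
diff12 = abs(-(1 + beta) * (1 + r1.conjugate() ** 2) / n
             - (-(1 + r1.conjugate() ** 2)))
diff22 = abs((beta * r1.conjugate() + r2) / n - r1.conjugate())
diff_mod = abs(abs(r1) ** 2 - beta)        # |r1| = sqrt(beta)
print(diff12, diff22, diff_mod)
```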