For which orthogonal matrices does the matrix exponential converge?


Part (a)

For which $2\times 2$ orthogonal matrices $A$ does $e^A=I+\frac{A}{1!}+\frac{A^2}{2!}+\cdots$ converge?

Part (b)

For what A does the series converge to an orthogonal matrix?

My work:

Let $A$ be $2\times 2$ and orthogonal. Then $A^TA = AA^T = I$, so $A$ is normal. Over the ground field $\Bbb C$, $A$ is therefore unitarily diagonalizable.

We can write $A=QDQ^*$, where $Q$ is unitary and $D$ is diagonal with the eigenvalues of $A$ on the diagonal. Since $A$ is orthogonal, each eigenvalue has modulus $1$.

Now $$e^A=I+\frac{A}{1!}+\frac{A^2}{2!}+\cdots$$

$$ e^A=I+\frac{QDQ^*}{1!}+\frac{QD^2Q^*}{2!}+\cdots$$

$$ e^A= Q\left(I+\frac{D}{1!}+\frac{D^2}{2!}+\cdots\right)Q^*$$

$$ e^A= Qe^DQ^*$$

Where $e^D$ is again diagonal.
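
As a numerical sanity check of this derivation, here is a small pure-Python sketch (no dependencies; `mat_mul`, `expm2`, and the angle `theta` are my own illustrative names) that sums the series for a sample $2\times 2$ rotation matrix and compares the result with the closed form $e^{\cos\theta}\,(\cos(\sin\theta)\,I+\sin(\sin\theta)\,J)$, which follows from writing $A=\cos\theta\,I+\sin\theta\,J$ with $J^2=-I$:

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(A, terms=40):
    """Partial sum I + A/1! + ... + A^(terms-1)/(terms-1)! for a 2x2 matrix."""
    S = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at the k=0 term I
    T = [[1.0, 0.0], [0.0, 1.0]]   # current term A^k / k!
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[S[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return S

theta = 1.0
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # an orthogonal (rotation) matrix

E = expm2(A)

# Closed form: A = cos(theta) I + sin(theta) J with J^2 = -I, and I, J commute,
# so e^A = e^{cos theta} (cos(sin theta) I + sin(sin theta) J).
c, s = math.cos(theta), math.sin(theta)
expected = [[math.exp(c) * math.cos(s), -math.exp(c) * math.sin(s)],
            [math.exp(c) * math.sin(s),  math.exp(c) * math.cos(s)]]
```

The partial sums agree with the closed form to machine precision after a few dozen terms, as the factorials in the denominators dominate.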

What can I say from here? I know that online sources such as Wikipedia and Wolfram just state without any proof or extended discussion that the matrix exponential is well-defined and converges for any square matrix. If this is stated as a fact without proof, then it seems a little strange that I am working on a problem that asks "for which orthogonal $2\times 2$ matrices $A$ does $e^A$ converge". Is there an important point that I am overlooking? Or can I really just state that the matrix exponential converges for any square matrix $A$, hence it is well-defined and converges for any $2\times 2$ orthogonal matrix $A$?

Any suggestions and hints for how to finish part (a) and how to start on part (b) are welcome.

Thanks,

Best answer:

I know that online sources such as Wikipedia and Wolfram just state without any proof or extended discussions that the matrix exponential is well-defined and converges for any square matrix.

$\quad$ Every matrix has an element of maximal absolute value; call it $M$. $($Obviously, if anything can cause divergence, it's that one.$)$ So construct a square matrix $S$ whose every element is $M$. Then $S^k=\Big(n^{k-1}M^k\Big)_{n\times n}$, and each element of $A^k$ lies between $\pm~n^{k-1}M^k$. But $e^S\approx\bigg(\dfrac{e^{nM}}n\bigg)_{n\times n}$, so every element of $e^A$ is definitely bounded. However, even with bounded partial sums, divergence could still theoretically happen if at least one such element $($not necessarily the same one each time$)$ were to oscillate freely inside a given range, without actually converging to any particular value within that interval. But this is not possible: the $k$-th term of the series is trapped between $\pm~\dfrac{n^{k-1}M^k}{k!}$, and these bounds shrink faster than any geometric rate because of the factorial, so the series converges absolutely, element by element.
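
As a quick sanity check of the entrywise bound used above (a pure-Python sketch of my own; the sample matrix, the helper `mat_mul`, and the range of powers tested are arbitrary illustrative choices): with $n=2$ and $M$ the largest absolute entry of $A$, every entry of $A^k$ should lie between $\pm\,n^{k-1}M^k$.

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 0.7
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # sample orthogonal matrix

n = 2
M = max(abs(x) for row in A for x in row)   # largest absolute entry of A

ok = True
P = A                                       # P holds A^k in the loop below
for k in range(1, 11):                      # check A^1 through A^10
    bound = n ** (k - 1) * M ** k
    ok = ok and all(abs(x) <= bound + 1e-12 for row in P for x in row)
    P = mat_mul(P, A)
```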

Another answer:

The fact that the series for $e^A$ converges is easy to prove using matrix norms. In particular, you should try to prove the inequality $$ \left\|e^A \right\| \leq e^{\|A\|} $$ whenever $\|\cdot\|$ is a submultiplicative norm (such as the Frobenius norm), i.e. one satisfying $\|AB\| \leq \|A\| \cdot \|B\|$ for all matrices $A,B$.
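
Here is a small numerical illustration of that inequality for the Frobenius norm (a pure-Python sketch of my own; `mat_mul`, `expm2`, `fro`, and the sample matrix are illustrative names and choices, not part of the answer):

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(A, terms=40):
    """Partial sum of the exponential series for a 2x2 matrix."""
    S = [[1.0, 0.0], [0.0, 1.0]]
    T = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[S[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return S

def fro(A):
    """Frobenius norm: square root of the sum of squared entries."""
    return math.sqrt(sum(x * x for row in A for x in row))

A = [[0.0, 1.0], [1.0, 0.0]]   # sample matrix; here e^A = cosh(1) I + sinh(1) A
lhs = fro(expm2(A))            # ||e^A||_F
rhs = math.exp(fro(A))         # e^{||A||_F}
```

For this $A$ one gets $\|e^A\|_F \approx 2.74$ against $e^{\|A\|_F} = e^{\sqrt 2} \approx 4.11$, consistent with the inequality.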


It is, in fact, a well-known result that $e^{tA}$ is orthogonal for every $t \in \Bbb R$ if and only if $A$ is skew-symmetric, which is to say that $A^T = -A$. This result is commonly used in the context of Lie groups and Lie algebras.

At the very least, you should try to prove that if $A^T = -A$, then $e^A$ is orthogonal.
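
A numerical check of that claim in the $2\times 2$ case (a pure-Python sketch of my own; `mat_mul`, `expm2`, and the angle are illustrative choices): for skew-symmetric $A = \pmatrix{0&-\theta\\\theta&0}$, the series should give the rotation by $\theta$, and $(e^A)^T e^A$ should be the identity.

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(A, terms=40):
    """Partial sum of the exponential series for a 2x2 matrix."""
    S = [[1.0, 0.0], [0.0, 1.0]]
    T = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[S[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return S

theta = 0.9
A = [[0.0, -theta], [theta, 0.0]]       # skew-symmetric: A^T = -A

E = expm2(A)                            # should be the rotation by theta
Et = [[E[j][i] for j in range(2)] for i in range(2)]   # transpose of E
G = mat_mul(Et, E)                      # should be (numerically) the identity
```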

I am not sure whether there are any other matrices $A$ for which $e^A$ is orthogonal.


A matrix $A$ such that $e^A$ is orthogonal but $A^T \neq -A$: $$ \pmatrix{0&1\\0&2\pi i} $$ or better yet (real) $$ \pmatrix{0&-1\\ 4 \pi^2 & 0} $$ In both cases $e^A = I$: the first matrix has distinct eigenvalues $0$ and $2\pi i$, both with $e^\lambda = 1$; the second satisfies $A^2 = -4\pi^2 I$, so $e^A = \cos(2\pi)\,I + \frac{\sin(2\pi)}{2\pi}A = I$.
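
The real counterexample can be checked numerically with the series itself (a pure-Python sketch of my own; `mat_mul` and `expm2` are illustrative helper names):

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(A, terms=60):
    """Partial sum of the exponential series for a 2x2 matrix."""
    S = [[1.0, 0.0], [0.0, 1.0]]
    T = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        T = [[x / k for x in row] for row in mat_mul(T, A)]
        S = [[S[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return S

# A^2 = -4 pi^2 I, so e^A = cos(2 pi) I + (sin(2 pi) / (2 pi)) A = I,
# which is orthogonal, even though A is not skew-symmetric (A^T != -A).
A = [[0.0, -1.0], [4 * math.pi ** 2, 0.0]]
E = expm2(A)
```

The entries of $A^k$ grow like $(2\pi)^k$ before the factorials take over, so more terms are needed here than for a rotation matrix; with 60 terms the result matches $I$ to well under $10^{-6}$ in double precision.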