How to show that exp is a diffeomorphism between symmetric reals and positive definite matrices?


I am looking for an easy proof of the fact that the exponential function is a diffeomorphism between the finite-dimensional vector space of symmetric real $n \times n$ matrices and the open subset of positive definite symmetric real matrices.

I know that the exp-map is a bijection and that it is smooth. So, there are two ways to proceed:

One way would be to use the Inverse Function Theorem. But this requires computing the derivative of $\exp$ at a matrix $A$ in the direction of another matrix $B$, and then showing that this linear map (viewed as a linear map in $B$) is invertible. I have no clue how to do that. I know that it suffices to show this for a real diagonal matrix $A$, since real symmetric matrices can always be diagonalised, but since $A$ and $B$ cannot in general be diagonalised simultaneously, computing the matrix exponential is no fun...

A totally different way would be to avoid the Inverse Function Theorem and instead construct the inverse (the "logarithm") directly and show that it is smooth. One could use the series expansion of the real logarithm and plug in positive definite matrices, but this seems complicated as well...

Does anyone have an idea how to show this without introducing too much complicated machinery?

Thanks very much in advance, Tom


Define $\log(A)$ as follows. Since $A$ is positive definite, $A = U^{-1} D U$ for some orthogonal $U$ and some diagonal $D$ with positive entries. Then $\log(A) = U^{-1} \log(D) U$, where $\log(D)$ simply applies $\log$ to the diagonal entries. Clearly this is a two-sided inverse to $\exp$. To show it is smooth, use $$ \log(A) = \frac1{2\pi i}\oint \log z \, (z I-A)^{-1} \, dz.$$ Use a counterclockwise contour that stays in the right half plane and encloses all the eigenvalues of $A$, and use the standard branch of $\log$ with a cut along the negative real axis. To verify the formula, first diagonalize $A$, then apply the Cauchy integral formula to each diagonal entry.
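The eigendecomposition construction of the logarithm can be sketched numerically. Here is a minimal illustration with NumPy (the helper names `sym_exp` and `sym_log` are my own, and this checks only the inverse property, not the smoothness argument via the contour integral):

```python
import numpy as np

def sym_exp(S):
    # exp of a symmetric matrix via its eigendecomposition S = U diag(w) U^T
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.exp(w)) @ U.T

def sym_log(A):
    # log of a symmetric positive definite matrix: apply log to the eigenvalues
    w, U = np.linalg.eigh(A)
    return U @ np.diag(np.log(w)) @ U.T

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)          # symmetric positive definite by construction

print(np.allclose(sym_exp(sym_log(A)), A))   # True: log is a right inverse of exp
print(np.allclose(sym_log(sym_exp(A)), A))   # True: and a left inverse
```

Note that `np.linalg.eigh` returns an orthogonal eigenvector matrix, so $U^{-1} = U^T$ and the conjugation is cheap.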


Thanks for your answer!

In the meantime, I found another possibility, and for completeness' sake I want to show it here:

Let $X$ be a real symmetric matrix. Without loss of generality (after conjugating by an orthogonal matrix, which commutes with $\exp$), we may assume that $X$ is diagonal with entries $\lambda_1, \dots, \lambda_n$.

We define a function $h$ of two real variables by $h(x,y) := \frac{\exp(x)-\exp(y)}{x-y}$ for $x \neq y$ and $h(x,x) := \exp(x)$.

This function is symmetric in $x$ and $y$ and is always positive.

Now, let $e_{i,j}$ denote the elementary matrix with a single $1$ at position $(i,j)$ and zeroes elsewhere.

Then we may form the directional derivative of the matrix exponential at the point $X$ in the direction $e_{i,j}$ and obtain: $$h(\lambda_i,\lambda_j) \, e_{i,j}.$$

Now, we fix the following basis of the real vector space of symmetric matrices:

$e_{i,j}+e_{j,i}$ for all $(i,j)$ with $1\leq i\leq j \leq n$.

This yields:

The directional derivative of $\exp$ at the point $X$ in the direction $e_{i,j}+e_{j,i}$ is $h(\lambda_i,\lambda_j)\,(e_{i,j}+e_{j,i})$, so each basis vector is mapped to a positive multiple of itself.
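This eigenvalue claim can be sanity-checked numerically by comparing a central finite difference of $\exp$ against $h(\lambda_i,\lambda_j)\,(e_{i,j}+e_{j,i})$. A sketch (the step size, tolerance, and helper `sym_exp` are my own choices):

```python
import numpy as np

def sym_exp(S):
    # exp of a symmetric matrix via eigendecomposition
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.exp(w)) @ U.T

def h(x, y):
    # divided difference of exp, extended by its limit on the diagonal
    return np.exp(x) if x == y else (np.exp(x) - np.exp(y)) / (x - y)

lam = np.array([0.5, -1.0, 2.0])
X = np.diag(lam)

i, j, t = 0, 2, 1e-6
E = np.zeros((3, 3))
E[i, j] = E[j, i] = 1.0              # basis vector e_{i,j} + e_{j,i}

# central difference approximation of the directional derivative of exp at X
fd = (sym_exp(X + t * E) - sym_exp(X - t * E)) / (2 * t)

print(np.allclose(fd, h(lam[i], lam[j]) * E, atol=1e-4))  # True
```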

So the linearisation of $\exp$ at the point $X$ is diagonalisable with eigenvalues $h(\lambda_i,\lambda_j)>0$.

Hence the linearisation at $X$ is invertible, and $\exp$ is a local diffeomorphism.

Together with bijectivity, we get a global diffeomorphism.

The only technical part of this proof is the computation of the directional derivative at the point $X$ in the direction $e_{i,j}$, which reduces to computing the exponential of an upper triangular $2\times 2$ matrix.
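The $2\times 2$ reduction rests on the closed form $\exp\begin{pmatrix} a & t \\ 0 & b \end{pmatrix} = \begin{pmatrix} e^a & t\,h(a,b) \\ 0 & e^b \end{pmatrix}$, which can be checked against a general-purpose matrix exponential. A sketch, assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import expm

def h(x, y):
    # divided difference of exp, extended by its limit on the diagonal
    return np.exp(x) if x == y else (np.exp(x) - np.exp(y)) / (x - y)

a, b, t = 0.3, -1.2, 0.7
T = np.array([[a, t], [0.0, b]])

# closed form for exp of an upper triangular 2x2 matrix
closed = np.array([[np.exp(a), t * h(a, b)],
                   [0.0,       np.exp(b)]])

print(np.allclose(expm(T), closed))  # True
```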

Do you think this proof works?

Thank you, Tom