Prove that a regular $\phi : E \to E$ can be uniquely decomposed as the composition of a self-adjoint map and a rotation (the unitary trick of Weyl)


In Greub's Linear Algebra (available on archive.org), the following is asked on page 226:

Note: $E$ is an $n$-dimensional real vector space.

Prove that a regular linear transformation $\phi$ of a Euclidean space can be uniquely written in the form $$\phi = \sigma \circ \tau,$$ where $\sigma$ is a positive self-adjoint transformation and $\tau$ is a rotation. Hint: Use Problems 5 and 10. (This is essentially the unitary trick of Weyl.)

Question 5:

A self-adjoint transformation $\phi$ is called positive if $$(x,\phi(x)) \geq 0$$ for every $x\in E$. Given a positive self-adjoint transformation $\phi$, prove that there exists exactly one positive self-adjoint transformation $\psi$ such that $\psi^2 = \phi$.

Question 10:

Note: $\bar \phi$ is the adjoint mapping of $\phi$, which is defined by $$(x, \phi (y)) = (\bar \phi(x), y) \quad \text{for all } x,y \in E.$$

Let $\phi$ be any linear transformation of an inner product space $E$. Prove that $\phi \circ \bar \phi$ is a positive self-adjoint mapping. Prove that $$(x, \phi \circ \bar \phi(x)) \geq 0 \quad \text{for all } x \in E,$$ with equality only if $x \in \ker(\bar\phi)$.
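Though not part of the book, the claim of Problem 10 is easy to sanity-check numerically. Below is my own illustration using NumPy, where matrices represent the maps and the transpose plays the role of the adjoint; it verifies that $\phi \circ \bar\phi$ is symmetric with non-negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
phi = rng.standard_normal((n, n))   # a generic linear map (almost surely regular)

A = phi @ phi.T                     # matrix of phi composed with its adjoint

# Self-adjointness: A equals its transpose.
assert np.allclose(A, A.T)

# Positivity: (x, Ax) = ||phi^T x||^2 >= 0 for random test vectors.
for _ in range(100):
    x = rng.standard_normal(n)
    assert x @ A @ x >= -1e-12
```

Of course this only checks one random example; the proofs sketched in the answers below are what the exercise actually asks for.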

Even though the author gives some hints on how to prove this result, I have been stuck on this problem for about a month and still couldn't figure out how to prove it. I then asked one of my assistants, but she couldn't figure it out either. So my main question is how we can prove this result, but any kind of help (more hints, or a general idea of the proof) is also very welcome.

Note that I have put a link to the book, copied directly from archive.org, so you can also check it out yourself.


On BEST ANSWER

If a linear transformation is positive, any eigenvalue must be non-negative:

$$0\le (Ax,x)=(\lambda x,x)=\lambda (x,x)=\lambda ||x||^2$$

Section 8.9 in your book shows that a self-adjoint operator has an orthonormal basis of eigenvectors $e_j$ with eigenvalues $\lambda_j$. So if $A$ is positive, self-adjoint, and invertible, its effect on a vector $\sum v^je_j$ can be written

$$Av=A\sum v^je_j= \sum v^jAe_j=\sum v^j\lambda_j e_j$$

where the $\lambda_j$ are positive.

Given such an $A$, consider the linear transformation $Bv=\sum v^j\sqrt{\lambda_j} e_j$. A straightforward computation then shows that $B^2=BB=A$ (thus $B$ is a "square root" of $A$), and that $B$ is also positive, self-adjoint, and invertible.

$B$ is also the unique positive self-adjoint square root of $A$. To see this, note that since $B$ is positive and self-adjoint and invertible it also has an orthonormal basis of eigenvectors $f_j$ with positive eigenvalues $c_j$. Since $Af_j=BBf_j=Bc_jf_j=c_j^2f_j$, each $f_j$ is also an eigenvector of $A$ and the corresponding $c_j$ equals $\sqrt{\lambda_i}$ for some $i$. In other words, the effect of $B$ is to scalar multiply each $\lambda_j$ eigenspace of $A$ by $\sqrt{\lambda_j}$. This determines $B$ uniquely since the eigenspaces of $A$ span $E$.
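The eigenbasis construction of $B$ translates directly into code. Here is a minimal sketch of my own using NumPy (`numpy.linalg.eigh` returns the eigenvalues and an orthonormal eigenbasis of a symmetric matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)         # positive, self-adjoint, invertible

# Orthonormal eigenbasis e_j with eigenvalues lambda_j (all positive here).
lam, E = np.linalg.eigh(A)

# B acts by sqrt(lambda_j) on each eigenvector: B = E diag(sqrt(lambda)) E^T.
B = E @ np.diag(np.sqrt(lam)) @ E.T

assert np.allclose(B @ B, A)        # B^2 = A
assert np.allclose(B, B.T)          # B is self-adjoint
assert np.all(np.linalg.eigvalsh(B) > 0)  # B is positive (hence invertible)
```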

Now we can use all of the above to prove the original claim. Given an invertible linear transformation $\phi$, consider $A=\phi\phi^T$. Then $A$ is positive and self-adjoint: $$(x,Ax)=(x,\phi\phi^T x)=(\phi^T x, \phi^T x)=||\phi^T x||^2\ge0$$ $$(x,Ax)=(x,\phi\phi^T x)=(\phi^T x,\phi^T x)=(\phi\phi^T x,x)=(Ax,x)$$ and it is invertible, so it has a unique positive self-adjoint square root $B$, which is also invertible.

Now suppose $\phi=BU$ for some rotation $U$. Then we must have $U=B^{-1}\phi$. So we'll define $U$ by this formula and prove it is a rotation, which is to say that it preserves the inner product:

$$(Uv,Uv)=(B^{-1}\phi v,B^{-1}\phi v)=(\phi^TB^{-1}B^{-1}\phi v,v)=(\phi^T A^{-1}\phi v,v)$$

$$=(\phi^T (\phi\phi^T)^{-1}\phi v,v)=(\phi^T (\phi^T)^{-1}\phi^{-1}\phi v,v)=(v,v).$$

(The second equality uses that $B^{-1}$, like $B$, is self-adjoint.)

So $U$ is a rotation. To prove uniqueness, suppose also that $\phi=CV$ for a positive self-adjoint $C$ and a rotation $V$. Then

$$\phi\phi^T=CV(CV)^T=CVV^TC^T=C^2$$ (using $VV^T=I$ and $C^T=C$),

so that $C$ is also a positive self-adjoint square root of $\phi\phi^T$. But uniqueness of $B$ was proved above, so $C=B$. Then $V=C^{-1}\phi=B^{-1}\phi=U$, so the decomposition is unique.
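The whole argument can be replayed numerically. The sketch below is my own illustration with NumPy, not from the book: build $A=\phi\phi^T$, take its unique positive square root $B$ via the eigenbasis, and check that $U=B^{-1}\phi$ is orthogonal:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
phi = rng.standard_normal((n, n))           # almost surely invertible

A = phi @ phi.T                             # positive, self-adjoint, invertible
lam, E = np.linalg.eigh(A)
B = E @ np.diag(np.sqrt(lam)) @ E.T         # the unique positive square root
U = np.linalg.solve(B, phi)                 # U = B^{-1} phi

assert np.allclose(B @ U, phi)              # phi = B U
assert np.allclose(U @ U.T, np.eye(n))      # U preserves the inner product
```

Note that $U U^T = B^{-1}\phi\phi^T B^{-1} = B^{-1} A B^{-1} = I$, which is exactly the computation the code confirms.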

Second answer:

Unfortunately, I don't see the connection to Weyl's unitary trick; I know this phenomenon by the name of Polar Decomposition.

For problem 5:

Note that there exists an orthonormal eigenbasis for $\phi$; this is proved earlier in the chapter. Let the eigenbasis be $\{e_{\lambda}\}_{\lambda}$. Now, let $f$ be a continuous function on some set containing the spectrum of $\phi$. It might seem reasonable to define $f(\phi)$ to be the operator with the same eigenbasis, acting by $f(\phi)(e_{\lambda})=f(\lambda)e_{\lambda}$.

Exercise: This coincides with evaluation in the natural way if $f$ is a polynomial, or, more generally, analytic.

Back to the problem at hand: we want to solve $$\psi^2=\phi;$$ how would you solve this if the unknowns $\psi$ and $\phi$ were real numbers, not operators? Can you apply the argument involving $f$ I sketched, above?
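As a concrete illustration of the functional-calculus sketch above (my own code, not part of the answer; `apply_function` is a name I made up), define $f(\phi)$ on the eigenbasis and take $f(t)=\sqrt{t}$ to solve $\psi^2=\phi$:

```python
import numpy as np

def apply_function(phi, f):
    """f(phi) for self-adjoint phi: same eigenbasis, eigenvalues f(lambda)."""
    lam, E = np.linalg.eigh(phi)
    return E @ np.diag(f(lam)) @ E.T

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
phi = M @ M.T                        # positive self-adjoint

psi = apply_function(phi, np.sqrt)   # f(t) = sqrt(t), just as for real numbers
assert np.allclose(psi @ psi, phi)   # psi^2 = phi

# For a polynomial f, this coincides with direct evaluation:
assert np.allclose(apply_function(phi, lambda t: t**2 + t), phi @ phi + phi)
```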

For problem 10:

For self-adjointness, note that taking adjoints is involutive and an antihomomorphism ($\bar{\bar\phi}=\phi$ and $\overline{\phi\circ\psi}=\bar\psi\circ\bar\phi$). Then mess around with the givens.

For positivity, mess around with $(\bar{\phi}x, \bar{\phi}x)$.

For the main problem:

Problem 10 suggests looking at $\phi\circ\bar{\phi}$. If we know what $\sigma$ is and we know $\sigma$ is invertible, then we immediately know $\tau$; namely, $$\tau=\sigma^{-1}\circ\phi.$$ Now substitute into $\phi\circ\bar{\phi}$. If you correctly use the two problems to construct $\sigma$, you should be able to get a cancellation and show that $\tau\circ\bar{\tau}=I$, i.e. $\tau$ is orthogonal. Then you have enough degrees of freedom to make it special orthogonal (and thus a rotation).

But, of course, $\sigma$ is not guaranteed invertible from the two problems; they explicitly make you spend time working out what kernels are.   Think about what I just said: they explicitly make you spend time working out what kernels are.   Can you work out a direct-sum decomposition to fix the above argument?

A final remark:

The source by Brian Hall on the linked Wiki page is pretty good.   There, he uses my sketch for problem 5 more extensively.   Greub does not expect you to solve these problems by that route, but if you're interested, can you use logarithms and exponentials to obviate some of the difficulties above?
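For the curious, here is a sketch of the logarithm/exponential route (my own illustration, using SciPy's `logm` and `expm`): for a positive self-adjoint invertible $A$, the multiplicative problem $\psi^2=A$ becomes additive, and $\psi=\exp\!\big(\tfrac12\log A\big)$ is the positive square root.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)          # positive self-adjoint, invertible

# log A is real and symmetric for positive definite A, so
# psi = exp((1/2) log A) is a positive self-adjoint square root.
psi = expm(0.5 * logm(A))

assert np.allclose(psi @ psi, A)     # psi^2 = A
assert np.allclose(psi, psi.T)       # psi is self-adjoint
```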