A unitary matrix taking a real matrix to another real matrix, is it an orthogonal matrix?


I tried to prove that a real antisymmetric matrix can be taken by an orthogonal transformation to the form:

$$\begin{pmatrix} 0 & \lambda_1 & & & \\ -\lambda_1 & 0 & & & \\ & & 0 & \lambda_2 & \\ & & -\lambda_2 & 0 & \\ & & & & \ddots \end{pmatrix}$$

where the eigenvalues are $\pm i\lambda_1, \pm i\lambda_2, \ldots$,

a statement I saw on Wikipedia: http://en.wikipedia.org/wiki/Antisymmetric_matrix

I also know that a real antisymmetric matrix can be diagonalized by a unitary transformation, and I found a unitary transformation taking the resulting diagonal matrix to the required form.

So by composing the two transformations (diagonalization, then taking the diagonal matrix to the required form), I'll get a unitary transformation taking the real antisymmetric matrix to another real matrix.

My question is whether this transformation must be a real matrix; if so, I can deduce that the unitary transformation is in fact an orthogonal transformation.

So is this true?

Is a unitary transformation taking a real matrix to another real matrix necessarily an orthogonal transformation?

EDIT: After receiving a counterexample in the comments, I'm adding:

Alternatively, if it is not necessarily orthogonal, does there necessarily exist an orthogonal transformation taking the two matrices to each other?
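For concreteness, here is one simple counterexample of the first kind (not necessarily the one given in the comments): a unimodular scalar multiple of the identity,

$$U = iI, \qquad U^*AU = (-iI)\,A\,(iI) = A,$$

so $U$ is unitary and takes every real matrix $A$ to a real matrix (namely $A$ itself), yet $U$ is not orthogonal, since $U^TU = -I \neq I$.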


On BEST ANSWER

Yes. Quoting Halmos's Linear Algebra Problem Book (Solution 160):

“If $A$ and $B$ are real, $U$ is unitary, and $U^*AU = B$, then there exists a real orthogonal $V$ such that $V^*AV = B$.

A surprisingly important tool in the proof is the observation that the unitary equivalence of $A$ and $B$ via $U$ implies the same result for $A^*$ and $B^*$. Indeed, the adjoint of the assumed equation is $U^*A^*U = B^*$.

Write $U$ in terms of its real and imaginary parts $U = E + i F$. It follows from $AU = UB$ that $AE = EB$ and $AF = FB$, and hence that $A(E+\lambda F) = (E+\lambda F)B$ for every scalar $\lambda$. If $\lambda$ is real and different from a finite number of troublesome scalars (the ones for which $\det(E+\lambda F) = 0$), the real matrix $S = E + \lambda F$ is invertible, and, of course, has the property that $AS=SB$.

Proceed in the same way from $U^*A^*U = B^*$: deduce that $A^*(E+\lambda F) = (E+\lambda F)B^*$ for all $\lambda$, and, in particular, for the ones for which $E+\lambda F$ is invertible, and infer that $A^*S = SB^*$ (and hence that $S^*A = BS^*$).

Let $S =VP$ be the polar decomposition of $S$ (that theorem works just as well in the real case as in the complex case, so that $V$ and $P$ are real.) Since $$BP^2 = BS^*S = S^*AS = S^*SB = P^2B,$$ so that $P^2$ commutes with $B$, it follows that $P$ commutes with $B$. Since $$AVP = AS = SB = VPB = VBP$$ and $P$ is invertible, it follows that $AV=VB$, and the proof is complete.”

Needless to say, that isn't the shortest path to prove the reduction of antisymmetric matrices...
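As a sanity check, Halmos's construction can be carried out numerically. The sketch below (a minimal illustration assuming NumPy, with a made-up $2\times 2$ example) forms $S = E + \lambda F$ for a $\lambda$ that makes $S$ invertible, takes the polar decomposition $S = VP$ via the SVD, and checks that the orthogonal factor $V$ does the same job as $U$:

```python
import numpy as np

def real_orthogonal_equivalence(A, B, U, tol=1e-9):
    """Given real A, B and a unitary U with U* A U = B, build a real
    orthogonal V with V^T A V = B, following Halmos (Solution 160)."""
    E, F = U.real, U.imag
    # Only finitely many real lam make det(E + lam*F) vanish,
    # so a small sweep finds an invertible S = E + lam*F.
    for lam in np.linspace(0.0, 3.0, 13):
        S = E + lam * F
        if abs(np.linalg.det(S)) > tol:
            break
    # Polar decomposition S = V @ P via the SVD: S = W diag(s) X^T
    # gives orthogonal V = W X^T (and P = X diag(s) X^T, not needed here).
    W, s, Xt = np.linalg.svd(S)
    return W @ Xt

# Hypothetical demo: B = Q^T A Q for a rotation Q, and U = e^{0.3i} Q,
# a genuinely non-real unitary realizing the same equivalence.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
c, s = np.cos(0.7), np.sin(0.7)
Q = np.array([[c, -s], [s, c]])
B = Q.T @ A @ Q
U = np.exp(0.3j) * Q
V = real_orthogonal_equivalence(A, B, U)
assert np.allclose(V.T @ V, np.eye(2))   # V is real orthogonal
assert np.allclose(V.T @ A @ V, B)       # V realizes the same equivalence
```

The $\lambda$-sweep is only a heuristic stand-in for "avoid the finitely many troublesome scalars"; any $\lambda$ with $\det(E+\lambda F)\neq 0$ works.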

Another answer:

Here is an answer that works for normal matrices, in particular for skew-Hermitian matrices.

Every normal matrix is unitarily diagonalizable, so there is an ONB (orthonormal basis) consisting of eigenvectors. You can replace this complex-entried ONB by a real-entried one that turns each pair of conjugate eigenvalues into a $2\times 2$ block of the kind you want, using the following observation: if $u\in\mathbb{C}^n$ is a unit eigenvector corresponding to the eigenvalue $i\lambda$ (with $\lambda \neq 0$), write $u=x+iy$ where $x, y\in \mathbb{R}^n$. Since the matrix is real, it is then almost immediate that $x$ and $y$ are perpendicular and of length $1/\sqrt{2}$. Using this, you can replace the original complex-entried ONB by a real-entried ONB.
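The key computation behind that observation, spelled out: if $Au = i\lambda u$ with $u = x + iy$, then comparing real and imaginary parts of

$$A(x+iy) = i\lambda(x+iy) = -\lambda y + i\lambda x$$

gives

$$Ax = -\lambda y, \qquad Ay = \lambda x,$$

so on the real invariant plane spanned by the orthonormal pair $\sqrt{2}\,x,\ \sqrt{2}\,y$ the matrix acts as the block $\begin{pmatrix} 0 & \lambda \\ -\lambda & 0 \end{pmatrix}$, exactly the form asked about in the question.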