Every linear orthogonal transformation can be represented as a block matrix


How would you show that every linear orthogonal transformation can be represented as a block matrix, with the blocks either being $2\times 2$ rotation matrices or $\pm1$? I have managed to show this in the two- and three-dimensional cases, but am having trouble generalising to $n$ dimensions.


You can use induction on $n$, the dimension of the space: prove that the result is true for $n=1$ (or $n=2$, which you have already done), and then prove that if the result holds for every dimension $k<n$, it also holds for dimension $n$.

A useful result is that if a subspace $S$ is invariant under the orthogonal linear transformation $T$, then $S^\bot$ is also invariant under $T$.
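For reference, a short justification of this lemma, using only that $T$ preserves inner products:

```latex
% T injective and \dim S < \infty imply T(S) = S.
% Hence every s \in S can be written as s = Tv with v \in S, and for u \in S^\bot:
\langle Tu, s\rangle = \langle Tu, Tv\rangle = \langle u, v\rangle = 0,
% so Tu \in S^\bot, i.e. S^\bot is T-invariant.
```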

Think also about how the matrix of $T$ looks in a basis containing both a basis of $S$ and a basis of $S^\bot$. To be more precise: if $\mathcal{B}=\{v_1,\ldots,v_r,v_{r+1},\ldots,v_n\}$ is an orthogonal basis of $V$ (not just any basis), and $\{v_1,\ldots,v_r\}$ is a basis of a subspace $S$ (so $\dim S=r$), then $S^\bot=\langle v_{r+1},\ldots,v_n\rangle$ (and those generators form a basis of $S^\bot$, of course).

Now, if $T\colon V \to V$ is an orthogonal endomorphism of $V$ that leaves $S$ invariant (and consequently leaves $S^\bot$ invariant as well), and $\mathcal{B}$ is as above, what can you say about the structure of the matrix $[T]_\mathcal{B}$?
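(In case you want to check your answer: since $T$ maps $S$ into $S$ and $S^\bot$ into $S^\bot$, the matrix splits into two diagonal blocks, one for each restriction:)

```latex
[T]_{\mathcal{B}} =
\begin{pmatrix}
A & 0 \\
0 & B
\end{pmatrix},
\qquad
A = [T|_{S}] \in \mathbb{R}^{r \times r},
\quad
B = [T|_{S^\bot}] \in \mathbb{R}^{(n-r) \times (n-r)},
```

where both restrictions are again orthogonal, so the inductive hypothesis applies to each block separately.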

Then, as mentioned by @hardmath, you can show that there is always a proper nonzero subspace $S$ invariant under $T$ (think about the eigenvalues of $T$: a real eigenvalue gives a one-dimensional invariant subspace, and a complex-conjugate pair gives a two-dimensional one). Since any proper subspace has dimension strictly smaller than $n$, you can apply the inductive hypothesis.

Perhaps you can complete the proof following this scheme.
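As a numerical sanity check of the statement (not part of the proof), here is a sketch in Python/NumPy: it builds an orthogonal map of $\mathbb{R}^5$ directly in the predicted block form (one $2\times 2$ rotation block and diagonal entries $\pm1$), then conjugates by an arbitrary orthogonal change of basis to obtain a "generic" orthogonal matrix with the same eigenvalue structure. The specific dimension, angle, and block layout are illustrative choices.

```python
import numpy as np

# Block-diagonal orthogonal matrix: a 2x2 rotation by theta, then -1, +1, +1.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = np.zeros((5, 5))
B[:2, :2] = R
B[2, 2], B[3, 3], B[4, 4] = -1.0, 1.0, 1.0

# An arbitrary orthogonal change of basis, via QR of a fixed random matrix.
U, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((5, 5)))

# Q is orthogonal but no longer visibly block diagonal; its eigenvalues are
# still e^{+-i theta}, -1, 1, 1, matching the blocks of B.
Q = U @ B @ U.T
eigs = np.linalg.eigvals(Q)
```

The eigenvalue check mirrors the hint above: the real eigenvalues $\pm1$ come from the $1\times 1$ blocks, and the unit-modulus complex pair from the rotation block.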