Matrices and Linear Algebra


Let $A$ be a real $n \times n$ orthogonal matrix. Let $X$ be a complex eigenvector of $A$ with non-real eigenvalue $\lambda$.

Prove that $X^TX = 0$.

Write $X = R + iS$, where $R$ and $S$ are real vectors. Prove that the subspace $W$ spanned by $R$ and $S$ is $A$-invariant, and describe the restriction of $A$ to $W$.

Prove that there exists a real orthogonal matrix $P$ such that $P^TAP$ is a block diagonal matrix with each block of size $1 \times 1$ or $2 \times 2$.

On BEST ANSWER

Regarding the $\textbf{first claim}$, note that since $A^TA = I$, $$ X^TX = X^TA^TAX = (AX)^T(AX) = (\lambda X)^T(\lambda X) = \lambda^2 X^TX, $$ which implies $\lambda^2 = 1$ or $X^TX = 0$. Since $\lambda$ is assumed non-real, $\lambda^2 \neq 1$, so $X^TX = 0$.
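As a quick numerical sanity check (a sketch using NumPy; the $3 \times 3$ rotation matrix and the angle $\theta = 0.7$ are illustrative choices, not part of the problem), we can confirm that $X^TX$ vanishes while the Hermitian norm $X^HX$ does not:

```python
import numpy as np

# A 3x3 rotation matrix: orthogonal, with one real eigenvalue (1)
# and a pair of non-real eigenvalues e^{+-i*theta}.
theta = 0.7
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

eigvals, eigvecs = np.linalg.eig(A)

# Pick an eigenvector whose eigenvalue has nonzero imaginary part.
k = np.argmax(np.abs(eigvals.imag))
lam, X = eigvals[k], eigvecs[:, k]

# X^T X (the bilinear form, NOT the conjugate inner product X^H X)
# vanishes for a non-real eigenvalue of an orthogonal matrix.
print(abs(X @ X))          # ~ 0
print(abs(X.conj() @ X))   # = 1 (numpy returns unit-norm eigenvectors)
```

Note the distinction: `X @ X` computes $\sum_i x_i^2$ with no conjugation, which is exactly the $X^TX$ of the claim; the Hermitian product $X^HX$ stays $1$.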

Regarding the $\textbf{second claim}$, write $\lambda = a + bi$ with $a, b \in \mathbb{R}$. Comparing real and imaginary parts of $AR + iAS = AX = \lambda X = (a+bi)(R+iS)$ gives $$ AR = aR - bS, \qquad AS = bR + aS. $$ In particular $AR$ and $AS$ lie in $W$, hence $W$ is $A$-invariant. With respect to the ordered pair $(R, S)$, the restriction of $A$ to $W$ is given by the matrix $$\begin{pmatrix} a & b \\ -b & a \end{pmatrix},$$ and since eigenvalues of an orthogonal matrix have modulus $1$, we have $a^2 + b^2 = |\lambda|^2 = 1$: the restriction of $A$ to $W$ is a rotation.
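The two displayed identities can also be checked numerically (same illustrative rotation matrix as above; a sketch, not part of the proof):

```python
import numpy as np

# Illustrative orthogonal matrix with a non-real eigenpair (lam, X).
theta = 0.7
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(np.abs(eigvals.imag))
lam, X = eigvals[k], eigvecs[:, k]

R, S = X.real, X.imag      # X = R + iS
a, b = lam.real, lam.imag  # lam = a + bi

# AR = aR - bS and AS = bR + aS, so A maps W = span{R, S} into itself.
print(np.allclose(A @ R, a * R - b * S))  # True
print(np.allclose(A @ S, b * R + a * S))  # True
print(abs(a**2 + b**2 - 1))               # ~ 0: |lam| = 1
```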

For the $\textbf{last claim}$ we note that the first part implies that $R$ and $S$ are orthogonal and of equal length, because $$0 = X^TX = (R+iS)^T(R+iS) = \lVert R \rVert_2^2 - \lVert S \rVert_2^2 + 2i\, R^TS,$$ so comparing imaginary parts gives $R^TS = 0$, and comparing real parts gives $\lVert R \rVert_2 = \lVert S \rVert_2$. After normalising, we can therefore complete $\{R, S\}$ to an orthonormal basis of $\mathbb{R}^n$. Denote it by $\{v_1, \dots, v_n\}$ with $v_1 = R/\lVert R \rVert_2$, $v_2 = S/\lVert S \rVert_2$, and define a new matrix $B$ by $$ Av_j = \sum^n_{i=1} B_{ij} v_i. $$ $B$ has the form $$\begin{pmatrix} * & * \\ 0 & * \end{pmatrix}$$ where the block in the upper-left corner has size $2 \times 2$ and the lower-left block is zero because $W$ is $A$-invariant. Moreover, since $A$ is orthogonal, the orthogonal complement $W^\perp$ is also $A$-invariant, so the upper-right block vanishes as well and $B$ is block diagonal. We repeat this procedure on $W^\perp$ for the remaining eigenvalues; the real eigenvalues (necessarily $\pm 1$) contribute the $1 \times 1$ blocks.
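The change-of-basis step can be sketched numerically: completing $\{R, S\}$ to an orthonormal basis via a QR factorisation (which preserves the span of the leading columns) and conjugating $A$ makes the lower-left block vanish. Again the matrix and the random completion vector are illustrative assumptions:

```python
import numpy as np

# Illustrative orthogonal matrix with non-real eigenvector X = R + iS.
theta = 0.7
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(np.abs(eigvals.imag))
X = eigvecs[:, k]
R, S = X.real, X.imag

# Complete {R, S} to an orthonormal basis: QR on a matrix whose first
# two columns are R and S keeps span{R, S} as the span of the first
# two columns of C. The third column is an arbitrary generic vector.
rng = np.random.default_rng(1)
M = np.column_stack([R, S, rng.standard_normal(3)])
C, _ = np.linalg.qr(M)

B = C.T @ A @ C
# The lower-left (n-2) x 2 block vanishes because W is A-invariant:
print(np.allclose(B[2:, :2], 0))        # True
print(np.allclose(C.T @ C, np.eye(3)))  # True: the basis is orthonormal
```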

The existence of $P$ now follows because we are changing between orthonormal bases: taking $P$ to be the matrix whose columns are $v_1, \dots, v_n$, the matrix $P$ is orthogonal and $P^TAP = B$.
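The full statement is essentially the real Schur decomposition specialised to a normal matrix. As a sketch (assuming SciPy is available; the random orthogonal matrix is an illustrative test case), `scipy.linalg.schur` with `output='real'` produces an orthogonal $P$ and a quasi-triangular $T = P^TAP$, which for an orthogonal input is block diagonal with $1 \times 1$ and $2 \times 2$ blocks:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)

# A random 5x5 orthogonal matrix (Q factor of a QR factorisation).
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# Real Schur form: Q = P T P^T with P orthogonal and T quasi-
# triangular; because Q is normal, T is in fact block diagonal.
T, P = schur(Q, output='real')

print(np.allclose(P.T @ P, np.eye(5)))  # True: P is orthogonal
print(np.allclose(P.T @ Q @ P, T))      # True
print(np.round(T, 6))                   # 1x1 and 2x2 diagonal blocks
```

Entries of $T$ above the diagonal blocks are zero up to rounding error, reflecting the $W^\perp$-invariance argument above.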