How to block diagonalize an orthogonal matrix?


I am trying to block diagonalize a real orthogonal matrix $A$, with the condition that the blocks themselves are orthogonal. I found a pretty old yet abstract paper that says, "By block diagonalization methods one can obtain eigenvalues and eigenvectors while simultaneously 'reducing' the size of the matrix, i.e., $\sigma(A) = \sigma(\Lambda_1) \cup \sigma(\Lambda_2)$."

However, I am not a math major and cannot understand much of it. So I am asking here whether it is possible, or implemented somewhere, to block diagonalize such a matrix. The matrices I want to block diagonalize are always real, square, and orthogonal (satisfying $A^TA = AA^T = I$).
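As an aside, the orthogonality condition is easy to verify numerically; a minimal sketch assuming NumPy, where `is_orthogonal` is my own helper name:

```python
import numpy as np

def is_orthogonal(A, tol=1e-8):
    """Check the orthogonality condition A^T A = I (my own helper)."""
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    return n == m and np.allclose(A.T @ A, np.eye(n), atol=tol)

# A plane rotation is orthogonal; an upper-triangular shear is not.
c, s = np.cos(0.3), np.sin(0.3)
R = np.array([[c, -s], [s, c]])
S = np.array([[1.0, 1.0], [0.0, 1.0]])
```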

As a sample matrix, $ A = \begin{bmatrix} -0.874030050 & 0.634268244 & 0.001600016 & 0.000706214 \\\ 0.917305750 & 0.569888842 & 0.000391318 & -0.001513837 \\\ 0.001600016 & 0.000706214 & -0.874030050 & 0.634268244 \\\ 0.000391318 & -0.001513837 & 0.917305750 & 0.569888842\end{bmatrix}$

Say I want two $2\times 2$ blocks along the main diagonal. How can I obtain this?
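One practical route (my suggestion, not from the paper): a real orthogonal matrix is normal, so its real Schur form is, up to rounding, exactly block diagonal with $2\times 2$ rotation blocks and $\pm 1$ entries. A sketch with SciPy, using a synthetic orthogonal test matrix rather than the sample above:

```python
import numpy as np
from scipy.linalg import schur

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Known orthogonal matrix: two plane rotations hidden by a random
# orthogonal change of basis; spectrum is {e^{±0.4i}, e^{±1.3i}}.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
D = np.zeros((4, 4))
D[:2, :2] = rot(0.4)
D[2:, 2:] = rot(1.3)
A = Q @ D @ Q.T

T, Z = schur(A, output='real')   # A = Z @ T @ Z.T with Z orthogonal
# Because A is normal, the quasi-triangular T is in fact block diagonal:
# two 2x2 rotation blocks, with the off-block entries at rounding level.
print(np.round(T, 6))
```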

EDIT 1: Is it possible to obtain a matrix like this?

$ A = \begin{bmatrix} -0.874030050\pm c_1 & 0.634268244\pm c_2 & 0.0000001 & 0.00000004 \\\ 0.917305750\pm c_3 & 0.569888842\pm c_4 & 0.00000008 & -0.00000037 \\\ 0.000000016 & 0.000000014 & -0.874030050\pm c_5 & 0.634268244\pm c_6 \\\ 0.000000018 & -0.00000037 & 0.917305750\pm c_7 & 0.569888842\pm c_8\end{bmatrix}$ Here the $c_i$ are small corrections to the desired blocks, and the remaining entries are zero or negligible. The Jordan normal form is upper triangular, which is not what I want.

I think it can be made possible with the paper I am referring to, but it is pretty old and I would need an interpretation too; it contains a lot of terms that I do not know. Probably there are updated methods as well.

EDIT 2: I have read here that Givens rotations can be applied to obtain orthonormal submatrices. I applied multiple Givens rotations to the matrix above; the result is below, but the desired form has not yet been obtained. $ A = \begin{bmatrix} -0.909289121 & 0.598620192 & -0.001459999 & -0.0000000 \\\ 0.909289121 & 0.598620192 & 0.001459999 & -0.00000000 \\\ -0.00000000 & -0.000959999 & 0.909290293 & 0.598619423 \\\ -0.00000000 & -0.000959999 & -0.909290293 & 0.598619423\end{bmatrix}$

I used only two rotations to get the above result; performing more rotations causes more harm. Is there a modified algorithm for this?
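For reference, here is a minimal sketch of a single Givens rotation (my own code, not the asker's exact procedure). One point worth noting: a one-sided product $G^TA$ can zero a chosen entry but changes the eigenvalues, while only the similarity $G^TAG$ preserves the spectrum, which may be why stacking one-sided rotations "causes harm":

```python
import numpy as np

def givens(n, i, j, theta):
    """Identity matrix with a rotation by theta embedded in the (i, j) plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])
# Choose theta so that (G.T @ A)[1, 0] = -sin(t)*A[0,0] + cos(t)*A[1,0] = 0.
theta = np.arctan2(A[1, 0], A[0, 0])
G = givens(2, 0, 1, theta)
B = G.T @ A        # one-sided: annihilates the (1, 0) entry, spectrum changes
C = G.T @ A @ G    # similarity: preserves the spectrum {1, 5} of A
```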

1 Answer
As I understand the exercises to Chapter 4 of [Bel], we can proceed as follows. Let $N\ge 2$ be a natural number, let $A$ be an $N\times N$ orthogonal matrix, and let $\lambda\ne\pm 1$ be an eigenvalue of $A$ (all eigenvalues of an orthogonal matrix satisfy $|\lambda|=1$). Let the corresponding eigenvector be $x+iy$, where $x$ and $y$ are real. Then $x$ and $y$ are orthogonal and of equal length, so after normalizing them we can choose an orthogonal matrix $T_1=(x,y,x_3,x_4,\dots,x_N)$ whose first two columns are $x$ and $y$. Then
$$T_1^TAT_1=\begin{pmatrix} \begin{matrix} \cos\varphi_1 & - \sin\varphi_1 \\ \sin\varphi_1 & \cos\varphi_1 \end{matrix} & 0\\ 0 & A_{N-2}\end{pmatrix}, $$
where the matrix $A_{N-2}$ is orthogonal too. Proceeding by induction, we can build an orthogonal matrix $T$ such that $T^TAT$ is a block-diagonal matrix whose blocks are $\begin{pmatrix} \cos\varphi_k & - \sin\varphi_k \\ \sin\varphi_k & \cos\varphi_k \end{pmatrix}$ or $(\pm 1)$.
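One step of this construction can be sketched in NumPy (my own illustration; the function name and the QR-based completion of $\{x,y\}$ to an orthonormal basis are my choices, not code from [Bel]):

```python
import numpy as np

def reduce_step(A, rng=np.random.default_rng(0)):
    """One step of the construction: returns an orthogonal T1 whose first two
    columns are the normalized real and imaginary parts x, y of an eigenvector
    for a non-real eigenvalue of the orthogonal matrix A."""
    w, V = np.linalg.eig(A)
    k = int(np.argmax(np.abs(w.imag)))       # an eigenvalue lambda != +-1
    x, y = V[:, k].real.copy(), V[:, k].imag.copy()
    x /= np.linalg.norm(x)
    y /= np.linalg.norm(y)                   # x and y are orthogonal for normal A
    n = A.shape[0]
    # Complete {x, y} to an orthonormal basis: QR of [x, y, random columns].
    T1, _ = np.linalg.qr(np.column_stack([x, y, rng.standard_normal((n, n - 2))]))
    return T1

# Demo on a 4x4 orthogonal matrix with spectrum {e^{±0.4i}, e^{±1.3i}}.
def rot(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
D = np.zeros((4, 4))
D[:2, :2] = rot(0.4)
D[2:, 2:] = rot(1.3)
A = Q @ D @ Q.T

T1 = reduce_step(A)
B = T1.T @ A @ T1   # 2x2 rotation block top-left, orthogonal A_{N-2} bottom-right
```

Repeating the same step on the trailing block `B[2:, 2:]` is the induction described above.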

References

[Bel] Richard Bellman, *Introduction to Matrix Analysis*, McGraw-Hill Book Company, Inc., New York–Toronto–London, 1960; Russian translation: Nauka, Moscow, 1969.