Diagonalizing $ \begin{pmatrix} 0 & M^T \\ M & 0 \end{pmatrix} $ numerically


$M$ is an $(L \times L)$ dimensional real matrix.

$ H=\begin{pmatrix} 0 & M^T \\ M & 0 \end{pmatrix} $ is a $(2L \times 2L)$ dimensional real symmetric matrix.

Well, we can diagonalize $H$ with the QR algorithm, e.g. via Matlab's eig(), Python, etc. These give an orthogonal matrix $V$ with normalized columns, such that $H=VDV^{-1}$. However, $V$ is not always what I want: $H$ has some symmetry structure that this general method ignores.

The special structure of $H$ makes its eigenvalues and eigenvectors come in pairs; see Equation (1).

$\psi_i$ and $\phi_i$ are column vectors:

$$\begin{pmatrix} 0 & M^T \\ M & 0 \end{pmatrix} \begin{pmatrix} \phi_i \\ \pm\psi_i \end{pmatrix}=\pm \lambda_i \begin{pmatrix} \phi_i \\ \pm\psi_i \end{pmatrix} \tag{1}$$

Therefore, I would like $V$ to have the following structure (where $\psi$ and $\phi$ are $L \times L$ matrices): $$ V \sim \begin{pmatrix} \phi & \phi \\ \psi & -\psi \end{pmatrix} $$

Squaring Equation (1), we get two decoupled equations:

$$[M M^T ]\,\phi_i=\lambda_i^2 \phi_i \tag{2.a}$$
$$[M^T M]\,\psi_i=\lambda_i^2 \psi_i \tag{2.b}$$
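As a quick numerical sanity check of (2.a)/(2.b) (a sketch in NumPy; the random test matrix $M$ and the size $L=4$ are my own choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4
M = rng.standard_normal((L, L))  # arbitrary test matrix

# (2.a) and (2.b): M M^T and M^T M share the same eigenvalues
# (the squared singular values of M) ...
ev_a = np.sort(np.linalg.eigvalsh(M @ M.T))
ev_b = np.sort(np.linalg.eigvalsh(M.T @ M))
print(np.allclose(ev_a, ev_b))  # True: same spectrum

# ... but the eigenvectors phi_i and psi_i come from two
# independent diagonalizations, so the relative sign linking
# them through M phi_i = lambda_i psi_i is not fixed by
# (2.a)/(2.b) alone.
```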

However, there is some subtlety here:

(1) implies (2.a) and (2.b).

But (2.a) and (2.b) do not always imply (1). Why is this happening?

I'm solving this problem numerically. Which piece of mathematics should I use in order to diagonalize a matrix with this extra symmetry?

Other subtleties:

If I diagonalize (1) directly, the pairing symmetry of the eigenvectors may not hold, because eigenvectors in the (near-degenerate) zero-eigenvalue subspace mix easily.

The $\lambda_i$ in (2.a) and (2.b) are the same in principle, but due to machine error they are not exactly equal. This becomes critical as $\lambda_i$ approaches $0$: the relative mismatch $\frac{|\lambda^2_{a i}-\lambda^2_{b i}|}{|\lambda^2_{a i}+\lambda^2_{b i}|}$ becomes significant.

There are 2 answers below.

Best answer:

What you want is to compute the singular value decomposition $$ M = \psi\Lambda\phi^T, $$ where $\psi$ and $\phi$ are $L\times L$ orthogonal matrices and $\Lambda$ is diagonal with nonnegative entries. Computing the SVD is numerically stable, and the decomposition above implies $$ M\phi = \psi\Lambda \quad\mbox{and}\quad M^T \psi = \phi\Lambda, $$ which is equivalent to the equation $$ H\begin{pmatrix}\phi &\phi\\ \psi &-\psi\end{pmatrix} = \begin{pmatrix}\phi &\phi\\ \psi &-\psi\end{pmatrix} \begin{pmatrix}\Lambda & 0\\0 & -\Lambda\end{pmatrix} $$ that you seek.
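To make this concrete, here is a sketch in NumPy (the random test $M$ and the $1/\sqrt{2}$ column normalization are my additions; `numpy.linalg.svd` returns $M = U S V^h$, so $\psi = U$ and $\phi = (V^h)^T$):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 4
M = rng.standard_normal((L, L))  # arbitrary test matrix

# SVD: M = psi @ diag(s) @ phi.T, with s >= 0.
psi, s, phiT = np.linalg.svd(M)
phi = phiT.T

# Paired eigenvectors of H; the 1/sqrt(2) makes the columns unit norm.
V = np.block([[phi, phi], [psi, -psi]]) / np.sqrt(2)
D = np.diag(np.concatenate([s, -s]))

H = np.block([[np.zeros((L, L)), M.T],
              [M, np.zeros((L, L))]])

assert np.allclose(H @ V, V @ D)            # H V = V D
assert np.allclose(V.T @ V, np.eye(2 * L))  # V is orthogonal
```

The eigenvalue pairing $\pm\lambda_i$ and the desired block structure of $V$ then hold exactly by construction, rather than being left to chance in a generic eigensolver.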

Second answer:

(2.a) and (2.b) are not independent. We can, in fact, solve (2.a) first: $$ MM^T \phi_i = \epsilon_i^2 \phi_i $$ Then $\psi_i$ is given by $$ \psi_i=\frac{1}{|\epsilon_i|} M^T \phi_i $$

In this way, (2) $\rightarrow$ (1) is guaranteed.

But when $|\epsilon_i| \rightarrow 0$, the numerical solution for $\psi_i$ is very unstable and not properly normalized.
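The instability can be seen in a small sketch (the test matrix with one nearly vanishing singular value is my own construction): dividing $M^T\phi_i$ by a tiny $\epsilon_i$ amplifies rounding error, whereas taking $\psi$ directly from the SVD avoids the division entirely.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 4
# Construct a test M with one nearly vanishing singular value.
U, _, Vt = np.linalg.svd(rng.standard_normal((L, L)))
M = U @ np.diag([2.0, 1.0, 0.5, 1e-12]) @ Vt

# Route via (2.a): psi_i = M^T phi_i / eps_i.  Near eps_i ~ 0 the
# computed eps_i is dominated by rounding noise in M M^T, so the
# division gives a column that is typically far from unit norm.
eps2, phi = np.linalg.eigh(M @ M.T)
eps = np.sqrt(np.clip(eps2, 0.0, None))
psi_unstable = (M.T @ phi) / np.where(eps > 0, eps, 1.0)
norms = np.linalg.norm(psi_unstable, axis=0)
print(norms)  # inspect: the column for the tiny eps_i is off

# Route via the SVD: psi is orthonormal by construction,
# even when singular values vanish.
psi, s, phiT = np.linalg.svd(M)
assert np.allclose(psi.T @ psi, np.eye(L))
```

For exactly or nearly singular $M$, the SVD route also fixes the zero-mode subspace consistently, which the division formula cannot do.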