Proving that my algorithm works


I developed an algorithm that maps eigenvectors onto a unique representation, removing the arbitrary phase and dealing with degeneracies. It works, but I don't know why. This is my algorithm:

Consider this matrix $$ H = \begin{pmatrix} 1.6375816690258 & -0.4755832684370 & -0.0699416837860 \\ -0.4755832684370 & 1.3547458407373 & 0.0521707197538 \\ -0.0699416837860 & 0.0521707197538 & 1.0076724902369 \end{pmatrix} $$

If I diagonalize it, I get the eigenvalues $[1,1,2]$, and depending on which numerical library I use, I get different eigenvector matrices: the first two columns (spanning the eigenspace of the doubly degenerate eigenvalue $1$) differ, while the third column (eigenvalue $2$) is the same: $$ v1= \begin{pmatrix} 0.5839738582086 & -0.1462629956726 & -0.7984871126235 \\ 0.8016696443658 & -0.0507931153320 & 0.5956054404867 \\ -0.1276726839671 & -0.9879408866586 & 0.0875927521938 \end{pmatrix} \quad v2 = \begin{pmatrix} 0.6020119026848 & 0.0000000000000 & -0.7984871126235 \\ 0.7899898096965 & -0.1455000338086 & 0.5956054404867 \\ 0.1161799018825 & 0.9893582466234 & 0.0875927521938 \end{pmatrix} $$
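The degeneracy is easy to reproduce with NumPy (a sketch; `numpy.linalg.eigh` is just one of the libraries that will return *some* orthonormal basis of the degenerate eigenspace):

```python
import numpy as np

# The symmetric matrix from the question.
H = np.array([
    [ 1.6375816690258, -0.4755832684370, -0.0699416837860],
    [-0.4755832684370,  1.3547458407373,  0.0521707197538],
    [-0.0699416837860,  0.0521707197538,  1.0076724902369],
])

# eigh returns eigenvalues in ascending order. The eigenvalue 1 is doubly
# degenerate, so the first two eigenvector columns are only determined up
# to an orthogonal mixing within that 2D subspace.
eigvals, eigvecs = np.linalg.eigh(H)
print(np.round(eigvals, 10))  # approximately [1. 1. 2.]
```

Any other backend (a different LAPACK build, a different library) may return a differently mixed pair of columns for eigenvalue $1$, which is exactly the situation shown above.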

The first two columns of each matrix are two different orthonormal bases of the same eigenspace. To bring them to a common representation I do the following:

  1. Pick the top 2x2 submatrix of the two degenerate columns $$ A1 = \begin{pmatrix} 0.5839738582086 & -0.1462629956726 \\ 0.8016696443658 & -0.0507931153320 \end{pmatrix} \quad A2 = \begin{pmatrix} 0.6020119026848 & 0.0000000000000 \\ 0.7899898096965 & -0.1455000338086 \end{pmatrix} $$
  2. Perform a QR decomposition of its transpose, with the restriction that the diagonal elements of $R$ must be positive.

$$ A1^\mathsf{T} = \begin{pmatrix} 0.5839738582086 & 0.8016696443658 \\ -0.1462629956726 & -0.0507931153320 \end{pmatrix} =\underbrace{\begin{pmatrix} 0.9700370633940 & 0.2429569831099 \\ -0.2429569831099 & 0.9700370633940 \end{pmatrix}}_{Q1} \underbrace{\begin{pmatrix} 0.6020119026848 & 0.7899898096965 \\ 0.0000000000000 & 0.1455000338086 \end{pmatrix}}_{R1} $$ $$ A2^\mathsf{T} = \begin{pmatrix} 0.6020119026848 & 0.7899898096965 \\ 0.0000000000000 & -0.1455000338086 \end{pmatrix} = \underbrace{\begin{pmatrix} 1.0000000000000 & 0.0000000000000 \\ 0.0000000000000 & -1.0000000000000 \end{pmatrix}}_{Q2} \underbrace{\begin{pmatrix} 0.6020119026848 & 0.7899898096965 \\ 0.0000000000000 & 0.1455000338086 \end{pmatrix}}_{R2} $$

This is the key part: $R1$ always equals $R2$, but I don't know why.

  3. Apply the unitary matrix $Q$ to the whole $3 \times 2$ block of degenerate eigenvectors $$ v1' = \begin{pmatrix} 0.5839738582086 & -0.1462629956726 \\ 0.8016696443658 & -0.0507931153320 \\ -0.1276726839671 & -0.9879408866586 \end{pmatrix} \begin{pmatrix} 0.9700370633940 & 0.2429569831099 \\ -0.2429569831099 & 0.9700370633940 \end{pmatrix}= \begin{pmatrix} 0.6020119026848 & 0.0000000000000 \\ 0.7899898096965 & 0.1455000338086 \\ 0.1161799018825 & -0.9893582466234 \end{pmatrix} $$ $$ v2' = \begin{pmatrix} 0.6020119026848 & 0.0000000000000 \\ 0.7899898096965 & -0.1455000338086 \\ 0.1161799018825 & 0.9893582466234 \end{pmatrix} \begin{pmatrix} 1.0000000000000 & 0.0000000000000 \\ 0.0000000000000 & -1.0000000000000 \end{pmatrix}= \begin{pmatrix} 0.6020119026848 & 0.0000000000000 \\ 0.7899898096965 & 0.1455000338086 \\ 0.1161799018825 & -0.9893582466234 \end{pmatrix} $$

On the right-hand side, the top 2x2 block is just $R^\mathsf{T}$. Since $Q1$ and $Q2$ are unitary matrices, the norm is conserved and $v1'$ and $v2'$ are still eigenvectors for the eigenvalue $1$.

I have tested this algorithm extensively and I have never found a case where it doesn't work (within numerical inaccuracies).
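The three steps above can be sketched in NumPy as follows (this is my paraphrase, not the code from the linked repository; the helper name `unique_basis` and the explicit sign fix are mine, since `numpy.linalg.qr` does not guarantee a positive diagonal in $R$):

```python
import numpy as np

def unique_basis(V):
    """Map a 3x2 orthonormal basis of the degenerate subspace to a unique
    representative: QR-factorize the transposed top 2x2 block with a
    positive diagonal in R, then rotate the whole basis by Q."""
    A = V[:2, :2]                # top 2x2 submatrix
    Q, R = np.linalg.qr(A.T)     # A^T = Q R
    # Enforce the positive-diagonal convention: Q R = (Q D)(D R) for D = diag(+-1).
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1.0      # guard: treat an exact zero as positive
    Q, R = Q * signs, signs[:, None] * R
    return V @ Q, R

# The two degenerate eigenvector blocks from the question.
v1 = np.array([
    [ 0.5839738582086, -0.1462629956726],
    [ 0.8016696443658, -0.0507931153320],
    [-0.1276726839671, -0.9879408866586],
])
v2 = np.array([
    [ 0.6020119026848,  0.0000000000000],
    [ 0.7899898096965, -0.1455000338086],
    [ 0.1161799018825,  0.9893582466234],
])

w1, R1 = unique_basis(v1)
w2, R2 = unique_basis(v2)
print(np.allclose(R1, R2), np.allclose(w1, w2))  # True True
```

Both inputs land on the same $R$ and the same rotated basis, reproducing the behavior described above.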

How can I prove that R1 = R2?

If you want to play with the algorithm, then you can find it on github: https://github.com/JuDFTteam/unique_eigenvectors

1 Answer

An explanation for this example:

For convenience, I replace $v1, v2, A1, A2$ with $V_1, V_2, A_1, A_2$, respectively.

You perform QR to get $A_1^\mathsf{T} = Q_1R_1$ and $A_2^\mathsf{T} = Q_2 R_2$ with some restriction (positive diagonal entries). You want to prove that $R_1 = R_2$.

Note that $V_1$ and $V_2$ have the same 3rd column (the eigenvector associated with eigenvalue $2$). Denote the first two entries of this column by $u$, i.e. $$u = \begin{pmatrix} -0.7984871126235 \\ 0.5956054404867 \end{pmatrix}.$$ Taking the top-left $2\times 2$ block of $V_1 V_1^\mathsf{T} = I_3$, we have $A_1A_1^\mathsf{T} + uu^\mathsf{T} = I_2$. Likewise, $V_2 V_2^\mathsf{T} = I_3$ gives $A_2 A_2^\mathsf{T} + uu^\mathsf{T} = I_2$. Thus, $A_1 A_1^\mathsf{T} = A_2 A_2^\mathsf{T}$.

Since $A_1^\mathsf{T} = Q_1R_1$ and $A_2^\mathsf{T} = Q_2 R_2$ with $Q_1, Q_2$ orthogonal, we have $A_1 A_1^\mathsf{T} = (Q_1R_1)^\mathsf{T}(Q_1R_1) = R_1^\mathsf{T}R_1$ and likewise $A_2 A_2^\mathsf{T} = R_2^\mathsf{T}R_2$; hence $R_1^\mathsf{T}R_1 = R_2^\mathsf{T}R_2$.

Let $$R_1 = \begin{pmatrix} a_1 & b_1 \\ 0 & c_1 \end{pmatrix}, \quad R_2 = \begin{pmatrix} a_2 & b_2 \\ 0 & c_2 \end{pmatrix}.$$ From $R_1^\mathsf{T}R_1 = R_2^\mathsf{T}R_2$, we have $$a_1^2 = a_2^2, \quad a_1b_1 = a_2b_2, \quad b_1^2 + c_1^2 = b_2^2 + c_2^2.$$ Since $a_1, a_2 > 0$, the first equation gives $a_1 = a_2$; the second then gives $b_1 = b_2$; and the third, together with $c_1, c_2 > 0$, gives $c_1 = c_2$. Hence $R_1 = R_2$.
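The mechanism is easy to check numerically on random data: two orthonormal bases of the same 2D subspace differ by an orthogonal mixing $S$, so $A_2^\mathsf{T} = S^\mathsf{T} A_1^\mathsf{T}$, and the positive-diagonal QR convention strips the orthogonal factor away. A sketch (the helper `pos_qr` is hypothetical, not a library function):

```python
import numpy as np

rng = np.random.default_rng(0)

def pos_qr(A):
    """QR decomposition of A with the positive-diagonal convention on R."""
    Q, R = np.linalg.qr(A)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0
    return Q * s, s[:, None] * R

# Random orthonormal basis V1 of a random 2D subspace of R^3 ...
V1 = np.linalg.qr(rng.standard_normal((3, 2)))[0]
# ... and a second basis V2 of the *same* subspace, mixed by a random
# 2x2 orthogonal matrix S (modeling two libraries' different outputs).
S = np.linalg.qr(rng.standard_normal((2, 2)))[0]
V2 = V1 @ S

# As in the argument above: A_i A_i^T = I_2 - u u^T is basis-independent,
# so R_1^T R_1 = R_2^T R_2, and the sign convention forces R_1 = R_2.
_, R1 = pos_qr(V1[:2, :2].T)
_, R2 = pos_qr(V2[:2, :2].T)
print(np.allclose(R1, R2))
```

This relies on the top 2x2 block being invertible (the $a_i > 0$ assumption); for a generic subspace that holds, which matches the "within numerical inaccuracies" caveat in the question.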