Vanishing of a sum involving second minors of a unitary matrix


Let $U$ be a $4 \times 4$ unitary matrix and let $M_{\{i_1i_2\}\{j_1j_2\}} = U_{i_1j_1}U_{i_2j_2}-U_{i_1j_2}U_{i_2j_1}$ denote its second minors (where $i_1<i_2$ and $j_1<j_2$). If $I$ and $J$ are 2-element subsets of $\{1,2,3,4\}$, then by the Cauchy–Binet formula we know that $$\sum_K \bar{M}_{KI}M_{KJ} = \begin{cases} 1 & \textrm{if } I = J \\ 0 & \textrm{if } I\neq J \end{cases}$$ where the sum runs over all 2-element subsets of $\{1,2,3,4\}$ and bar denotes complex conjugation.
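The Cauchy–Binet identity above is easy to verify numerically. A minimal NumPy sketch (index sets are 0-based here; a random unitary matrix is obtained from the QR decomposition of a complex Gaussian matrix):

```python
import numpy as np
from itertools import combinations

PAIRS = list(combinations(range(4), 2))  # the six 2-element subsets of {0,1,2,3}

def second_minors(U):
    """6x6 matrix whose (K, J) entry is M_{KJ} = U[i1,j1]U[i2,j2] - U[i1,j2]U[i2,j1]."""
    M = np.empty((6, 6), dtype=complex)
    for k, (i1, i2) in enumerate(PAIRS):
        for j, (j1, j2) in enumerate(PAIRS):
            M[k, j] = U[i1, j1] * U[i2, j2] - U[i1, j2] * U[i2, j1]
    return M

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)  # a random 4x4 unitary matrix
M = second_minors(U)
# sum_K conj(M[K, I]) M[K, J] equals 1 if I = J and 0 otherwise:
print(np.allclose(M.conj().T @ M, np.eye(6)))  # prints True
```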

Suppose now that $a_K$ are some given positive, distinct coefficients and the unitary matrix $U$ is such that $$(\star) \qquad \sum_K a_K\bar{M}_{K,\{1,2\}}M_{KJ} = 0$$ for $J = \{1,3\}, \{1,4\}, \{2,3\}, \{2,4\}$. Does this imply that $(\star)$ holds also for $J = \{3,4\}$?

I've tried to check it numerically, but I have trouble even with finding examples of unitary matrices that satisfy $(\star)$. Any hint would be much appreciated.

EDIT: I've learned that the problem seems algebraic-geometric and that the Plücker relations might be of use. If so, how?

Best answer:

No, this doesn’t hold. For simplicity, let’s work with real orthogonal matrices. Consider the six $6$-dimensional vectors $x_J$ with $\left(x_J\right)_K=M_{KJ}$. Your first equation says that the $x_J$ are orthonormal, and your second equation says that $x_{\{1,2\}}$ is $A$-orthogonal to four of the other $x_J$, where $A=\operatorname{diag}\left(a_K\right)$. This imposes $4$ linear constraints on the six coefficients $a_K$, leaving at least a $2$-dimensional space of solutions. We know that $a_K=1$ is one solution. Adding a sufficiently small multiple of a second, independent solution to $a_K=1$ yields positive $a_K$, and there’s no apparent reason why they shouldn’t be distinct.

Now for $(\star)$ to hold for all five $J\ne\{1,2\}$, the $a_K$ would have to satisfy five linear constraints. If $x_{\{1,2\}}$ has non-zero components (which it generically has), these constraints are linearly independent, because their normal vectors are the linearly independent $x_J$ multiplied by the invertible matrix $\operatorname{diag}\left(\left(x_{\{1,2\}}\right)_K\right)$. That leaves only a one-dimensional space of solutions; but we know that $a_K=1$ satisfies all five constraints, so the independent solution with distinct $a_K$ can’t also satisfy them.

Since the absence of an apparent reason for the components of the second independent solution not to be distinct doesn’t exclude an inapparent one, I checked by writing Java code that generates a random orthogonal $4\times4$ matrix and finds the corresponding $a_K$. Here’s the result of one run, with distinct $a_K$:

0.5689404887917452 0.35488642908177226 0.178089199100431 0.7201712156360913 
-0.5260053343296579 0.8354451257813722 0.15541669620170148 -0.03457572279273246 
0.41491940358169704 0.10570205096749075 0.7123820276623269 -0.5560402967558973 
0.47693500291084046 0.4060962209661677 -0.6607887214300998 -0.41349380641819544 

a_0 : 0.8802879448836061
a_1 : 0.38448998107210963
a_2 : 1.391119069321157
a_3 : 0.8917858631363342
a_4 : 0.8139470860135438
a_5 : 1.638370055573215
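The construction in the answer can be sketched in Python with NumPy (rather than the original Java). Index sets are 0-based, so column $0$ of the minor matrix corresponds to $J=\{1,2\}$ and column $5$ to $J=\{3,4\}$; the seed and the choice of null vector are arbitrary:

```python
import numpy as np
from itertools import combinations

PAIRS = list(combinations(range(4), 2))  # the six 2-element subsets of {0,1,2,3}

def second_minors(U):
    """6x6 matrix whose (K, J) entry is the 2x2 minor of U on rows PAIRS[K], columns PAIRS[J]."""
    M = np.empty((6, 6))
    for k, (i1, i2) in enumerate(PAIRS):
        for j, (j1, j2) in enumerate(PAIRS):
            M[k, j] = U[i1, j1] * U[i2, j2] - U[i1, j2] * U[i2, j1]
    return M

rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal matrix
M = second_minors(U)
assert np.allclose(M.T @ M, np.eye(6))  # Cauchy-Binet: the x_J are orthonormal

# The four constraints (star): rows are the componentwise products x_{12} * x_J
# for J = {1,3}, {1,4}, {2,3}, {2,4} (columns 1..4 of M).
B = np.array([M[:, 0] * M[:, j] for j in (1, 2, 3, 4)])

# The null space of B is generically 2-dimensional and contains the all-ones
# vector (that is the Cauchy-Binet identity itself).
_, _, Vt = np.linalg.svd(B)
null_basis = Vt[4:]  # last two right-singular vectors span the null space
# Project the basis off the ones direction to get a null vector independent of
# ones; the projection stays in the null space because ones lies in it too.
ones = np.ones(6)
proj = null_basis - np.outer(null_basis @ ones / 6.0, ones)
w = proj[np.argmax(np.linalg.norm(proj, axis=1))]
w /= np.linalg.norm(w)

a = ones + 0.4 * w / np.abs(w).max()  # entries lie in (0.6, 1.4), hence positive
print("a_K:", a)
print("four constraints:", B @ a)  # ~ 0 by construction
print("left side for J = {3,4}:", a @ (M[:, 0] * M[:, 5]))  # generically nonzero
```

Running this for a few seeds produces positive, distinct $a_K$ satisfying the four constraints while the $J=\{3,4\}$ sum stays visibly away from zero, consistent with the counting argument above.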