If three nonzero real matrices mutually anticommute, then at least one of them has a negative off-diagonal element


The following is a statement that I believe to be correct but am unable to prove (I have also spent a long time searching for a counterexample, without success):

Let $X_1,X_2,X_3$ be $n\times n$ nonzero real matrices satisfying $X_{i}X_j=-X_jX_i\neq 0$ for $1\leq i<j\leq 3$ [for example, $\{X_1,X_2,X_3\}=\{\sigma^x,i\sigma^y,\sigma^z\}$, built from the Pauli matrices]. Prove that at least one of $X_1,X_2,X_3$ has a negative off-diagonal element.
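For concreteness, here is a quick NumPy check (just a sketch; the three matrices are the real matrices named above) that the example anticommutes pairwise with nonzero products, and that its negative off-diagonal element sits in $i\sigma^y$:

```python
import numpy as np

# The example from the statement, written as real matrices:
X1 = np.array([[0, 1], [1, 0]])    # sigma^x
X2 = np.array([[0, 1], [-1, 0]])   # i * sigma^y
X3 = np.array([[1, 0], [0, -1]])   # sigma^z

mats = [X1, X2, X3]
for a in range(3):
    for b in range(a + 1, 3):
        P = mats[a] @ mats[b]
        assert np.array_equal(P, -(mats[b] @ mats[a]))  # X_a X_b = -X_b X_a
        assert P.any()                                  # X_a X_b != 0

# The negative off-diagonal element the statement predicts:
print(X2[1, 0])  # -1
```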

The case $n=2$ is easy and straightforward: just expand $X_1,X_2,X_3$ in the basis of Pauli matrices and solve the equations $\{X_i,X_j\}=0$. But when $n$ becomes large, the coupled quadratic equations quickly become formidable, and I have no way to handle them.

Notice also that the $\neq 0$ condition is important; otherwise we get a simple counterexample: $X_1=X_2=X_3=\sigma^+$, which mutually anticommute (trivially, since every pairwise product is zero) but have only nonnegative entries.
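A one-line check of this degenerate example (a sketch; $\sigma^+$ here is the real matrix with a single $1$ above the diagonal):

```python
import numpy as np

# sigma^+ squares to zero, so three copies of it "anticommute"
# only because every pairwise product vanishes.
Sp = np.array([[0, 1], [0, 0]])
assert not (Sp @ Sp).any()   # X_i X_j = 0, so {X_i, X_j} = 0 trivially
assert (Sp >= 0).all()       # all entries nonnegative
```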


There are 2 answers below

BEST ANSWER

This isn't true, at least when the matrices are allowed to be singular. Let $$ D=\pmatrix{1&0\\ 0&-1}, \ S=\pmatrix{0&1\\ 1&0}. $$ Then $DS$ and $SD$ are nonzero but $DS+SD=0$. The following set of matrices now serves as a counterexample to your statement: $$ X_1=\pmatrix{D\\ &D\\ &&0}, \ X_2=\pmatrix{S\\ &0\\ &&D}, \ X_3=\pmatrix{0\\ &S\\ &&S}. $$
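This counterexample is easy to verify numerically; here is a sketch in NumPy that assembles the three $6\times 6$ block-diagonal matrices and checks all of the claimed properties:

```python
import numpy as np

D = np.array([[1, 0], [0, -1]])
S = np.array([[0, 1], [1, 0]])
Z = np.zeros((2, 2), dtype=int)

def blk(a, b, c):
    """6x6 block-diagonal matrix with 2x2 blocks a, b, c."""
    return np.block([[a, Z, Z], [Z, b, Z], [Z, Z, c]])

X1, X2, X3 = blk(D, D, Z), blk(S, Z, D), blk(Z, S, S)

mats = [X1, X2, X3]
for a in range(3):
    for b in range(a + 1, 3):
        P = mats[a] @ mats[b]
        assert np.array_equal(P, -(mats[b] @ mats[a]))  # anticommutation
        assert P.any()                                  # nonzero product

# No negative off-diagonal element anywhere:
for X in mats:
    off = X - np.diag(np.diag(X))
    assert (off >= 0).all()
print("counterexample verified")
```

Each pairwise product picks out exactly one block position where both factors are nonzero, which is why the anticommutation reduces to $DS+SD=0$ in every case.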

ANSWER

This is just a start. As you say, the coupled quadratic equations seem impossible to deal with, but I think we can make progress proceeding by contradiction.

It's easy to prove that not all the off-diagonal elements can be strictly positive. Assume, on the contrary, that they are.

Let the three matrices be $$A=(a_{ij})_{n\times n},\,B=(b_{ij})_{n\times n},\,C=(c_{ij})_{n\times n}$$

For $1\leq i\leq n,$ we have $$\begin{align} \left(AB\right)_{ii}&=a_{ii}b_{ii}+\sum_{k\neq i}a_{ik}b_{ki}\tag{1}\\ \left(BA\right)_{ii}&=b_{ii}a_{ii}+\sum_{k\neq i}b_{ik}a_{ki} \end{align}$$
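Identity $(1)$ is just the definition of matrix multiplication with the $k=i$ term split off from the sum; a quick randomized check (a sketch, with arbitrary size $n=4$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# (AB)_ii = a_ii * b_ii + sum_{k != i} a_ik * b_ki   -- identity (1)
for i in range(n):
    split = A[i, i] * B[i, i] + sum(A[i, k] * B[k, i] for k in range(n) if k != i)
    assert np.isclose((A @ B)[i, i], split)
```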

Since both sums are strictly positive, $\left(AB\right)_{ii}+\left(BA\right)_{ii}=2a_{ii}b_{ii}+\sum_{k\neq i}\left(a_{ik}b_{ki}+b_{ik}a_{ki}\right)=0$ forces $a_{ii}b_{ii}<0$, so $a_{ii}$ and $b_{ii}$ must have opposite signs. But the same is true of $a_{ii}$ and $c_{ii}$ and of $b_{ii}$ and $c_{ii}$, and three real numbers cannot have pairwise opposite signs.

Now consider the case where all the off-diagonal elements in the three matrices are nonnegative. Proceeding as in $(1)$, we see that either $a_{ii}$ and $b_{ii}$ have opposite signs, or at least one of them is $0$. Then at least one of $a_{ii}, b_{ii}, c_{ii}$ is $0$, as above. Suppose that $a_{ii}=0$. Then both sums in $(1)$ must be $0$, so that for $k\ne i$ at least one of $a_{ik},\, b_{ki}$ is $0$ and at least one of $b_{ik},\, a_{ki}$ is $0$. Of course, a similar statement holds with $C$ in place of $B$.
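These zero-pattern deductions can be sanity-checked against the block-diagonal counterexample from the accepted answer (a sketch; the deduced conditions are that $a_{ii}b_{ii}c_{ii}=0$ for every $i$, and that whenever a diagonal entry in a pair vanishes, every term of both sums in $(1)$ vanishes for that pair):

```python
import numpy as np

# The matrices from the accepted answer's counterexample.
D = np.array([[1, 0], [0, -1]])
S = np.array([[0, 1], [1, 0]])
Z = np.zeros((2, 2), dtype=int)

def blk(a, b, c):
    return np.block([[a, Z, Z], [Z, b, Z], [Z, Z, c]])

A, B, C = blk(D, D, Z), blk(S, Z, D), blk(Z, S, S)
n = 6

# At least one of a_ii, b_ii, c_ii vanishes for every i:
for i in range(n):
    assert A[i, i] * B[i, i] * C[i, i] == 0

# If a diagonal entry in a pair vanishes, each term of both
# sums in (1) vanishes for that pair:
for X, Y in [(A, B), (A, C), (B, C)]:
    for i in range(n):
        if X[i, i] == 0 or Y[i, i] == 0:
            for k in range(n):
                if k != i:
                    assert X[i, k] * Y[k, i] == 0
                    assert Y[i, k] * X[k, i] == 0
```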

We see that a counterexample will have a lot of $0$ elements. My thought is to either show that there are so many $0$s that one of the matrix products must be $\mathbf{0}$, or to eliminate so many variables that it's possible to find a counterexample.

I've been trying to exploit the facts above in the equation $\left(AB\right)_{ij}=-\left(BA\right)_{ij}$ for $i\neq j,$ but it's a little more complicated.

I think the basic problem is to find a compact way to express the facts about the positions of the $0$s that lends itself easily to further computation, but so far I haven't gotten anywhere.