Matrices Commuting with a Kronecker Sum


Throughout, let $A$ and $B$ be complex $m \times m$ and $n \times n$ matrices respectively. By $A \otimes B$ we mean the Kronecker product of $A$ and $B$, and by $A \oplus B$ the Kronecker sum of $A$ and $B$, namely $$A \oplus B = A \otimes I_n + I_m \otimes B,$$ where $I_r$ denotes the $r \times r$ identity matrix.
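For concreteness, the Kronecker sum is easy to form numerically. A small NumPy sketch (the helper name `kron_sum` is my own label, not a library function):

```python
import numpy as np

def kron_sum(A, B):
    """Kronecker sum A (+) B = A (x) I_n + I_m (x) B."""
    m, n = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)

# Diagonal example: the diagonal of diag(a) (+) diag(b) lists all sums a_i + b_j.
D = kron_sum(np.diag([1.0, 2.0]), np.diag([10.0, 20.0]))
print(np.diag(D))  # [11. 21. 12. 22.]
```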

Let $C$ be an arbitrary $mn \times mn$ matrix that commutes with $A \oplus B$. What (if anything) can be said about $C$?

For example, one could write $C$ as $$C=\sum_{k=1}^n\sum_{l=1}^n (X_{kl} \otimes e^n_{kl})=\sum_{i=1}^m\sum_{j=1}^m (e^m_{ij} \otimes Y_{ij}),$$ where $e^r_{ij}$ is the $r \times r$ matrix with a $1$ as the $(i,j)$-th entry and $0$ everywhere else, and the $X_{kl}$ and $Y_{ij}$ are $m \times m$ and $n \times n$ matrices respectively. If $C$ commutes with $A \oplus B$, then does every matrix $X_{kl}$ commute with $A$ and every matrix $Y_{ij}$ commute with $B$?
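In NumPy terms (0-indexed, so $e^n_{kl}$ has its $1$ in position $(k,l)$ for $0\le k,l<n$), the two families of blocks can be read off from $C$ by strided and contiguous slicing respectively. A sketch, with helper names of my own choosing:

```python
import numpy as np

def X_blocks(C, m, n):
    """m x m blocks X_kl[i, j] = C[i*n + k, j*n + l], so C = sum X_kl (x) e^n_kl."""
    return {(k, l): C[k::n, l::n] for k in range(n) for l in range(n)}

def Y_blocks(C, m, n):
    """n x n blocks Y_ij = C[i*n:(i+1)*n, j*n:(j+1)*n], so C = sum e^m_ij (x) Y_ij."""
    return {(i, j): C[i*n:(i+1)*n, j*n:(j+1)*n] for i in range(m) for j in range(m)}

# Both decompositions reconstruct an arbitrary C; shown here for the X blocks.
rng = np.random.default_rng(0)
m, n = 2, 3
C = rng.standard_normal((m * n, m * n))
recX = np.zeros_like(C)
for (k, l), X in X_blocks(C, m, n).items():
    E = np.zeros((n, n)); E[k, l] = 1.0   # e^n_{kl}
    recX += np.kron(X, E)
print(np.allclose(recX, C))  # True
```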

I have found no counterexamples thus far, but I also fail to see why it might be true in general. If we write $$C(A \oplus B)=(A \oplus B)C,$$ then the mixed-product property of Kronecker products gives $$\sum_{k=1}^n\sum_{l=1}^n ((X_{kl}A-A X_{kl}) \otimes e^n_{kl}) + \sum_{i=1}^m\sum_{j=1}^m (e^m_{ij} \otimes (Y_{ij}B-B Y_{ij}))=0,$$ but it is not at all clear to me that one can deduce from this that $X_{kl}A-A X_{kl}=0$ and $Y_{ij}B-B Y_{ij}=0$ for all $i,j,k,l$.
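One can at least probe the question numerically: a convenient way to produce a $C$ that commutes with $A\oplus B$ is to take a random polynomial in $A\oplus B$ (for generic $A,B$ this in fact exhausts the commutant, as the answer below explains). The following sketch checks the block commutators for such a $C$; it is an experiment, not a proof:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
S = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)  # A (+) B

# A random polynomial in S necessarily commutes with S.
C = sum(c * np.linalg.matrix_power(S, p)
        for p, c in enumerate(rng.standard_normal(m * n)))
print(np.allclose(C @ S, S @ C))  # True

# Blocks, 0-indexed: X_kl[i, j] = C[i*n + k, j*n + l]; Y_ij = (i, j) block of C.
X_ok = all(np.allclose(C[k::n, l::n] @ A, A @ C[k::n, l::n])
           for k in range(n) for l in range(n))
Y_ok = all(np.allclose(C[i*n:(i+1)*n, j*n:(j+1)*n] @ B,
                       B @ C[i*n:(i+1)*n, j*n:(j+1)*n])
           for i in range(m) for j in range(m))
print(X_ok, Y_ok)  # True True
```

That every block commutes is exact here, not just numerical: $(A\oplus B)^p$ expands into the span of the $A^i\otimes B^j$ (since $A\otimes I_n$ and $I_m\otimes B$ commute), and for $C=A^i\otimes B^j$ the blocks are $X_{kl}=(B^j)_{kl}A^i$ and $Y_{ij'}=(A^i)_{ij'}B^j$, which commute with $A$ and $B$ respectively.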

There is 1 answer below.

Let $\operatorname{spectrum}(A)=(\lambda_i)_{i\leq m}$ and $\operatorname{spectrum}(B)=(\mu_j)_{j\leq n}$.

We consider the case where $A$ and $B$ are generic (for example, take $A$ and $B$ at random). Then the $(\lambda_i)$ (resp. the $(\mu_j)$) are pairwise distinct.

Moreover, $\operatorname{spectrum}(A\oplus B)=(\lambda_i+\mu_j)_{i,j}$ has $mn$ distinct elements. (To see that these sums are the eigenvalues, note that $A\otimes I_n$ and $I_m\otimes B$ commute, hence are simultaneously triangularizable.)
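Both facts are easy to confirm numerically for random $A$ and $B$. A sketch (symmetric matrices are used only so that all eigenvalues are real and the sorted comparison is straightforward; "distinct" is checked up to a crude tolerance):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 3
# Symmetric A, B keep every eigenvalue real (then S is symmetric too).
A = rng.standard_normal((m, m)); A = A + A.T
B = rng.standard_normal((n, n)); B = B + B.T
S = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)  # A (+) B

lam = np.linalg.eigvalsh(A)
mu = np.linalg.eigvalsh(B)
sums = np.sort((lam[:, None] + mu[None, :]).ravel())  # the mn values lambda_i + mu_j

# spectrum(A (+) B) = (lambda_i + mu_j)
print(np.allclose(np.sort(np.linalg.eigvalsh(S)), sums))  # True

# and for random A, B the mn sums are pairwise distinct
print(np.min(np.diff(sums)) > 1e-8)
```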

Then the commutant $C(A\oplus B)$, i.e. the set of matrices commuting with $A\oplus B$, is a vector space of dimension $mn$ consisting exactly of the polynomials in $A\oplus B$; this is because $A\oplus B$ has $mn$ distinct eigenvalues.

Finally, the $mn$ linearly independent matrices of the form $A^i\otimes B^j$, $0\le i<m$, $0\le j<n$, constitute a basis of $C(A\oplus B)$.
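A quick numerical check of this claimed basis for random (hence, with probability $1$, generic) $A$ and $B$; a sketch, not a proof:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 2, 3
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
S = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)  # A (+) B

basis = [np.kron(np.linalg.matrix_power(A, i), np.linalg.matrix_power(B, j))
         for i in range(m) for j in range(n)]

# Each A^i (x) B^j commutes with A (+) B ...
print(all(np.allclose(M @ S, S @ M) for M in basis))  # True

# ... and the mn of them are linearly independent (full rank when vectorized).
V = np.stack([M.ravel() for M in basis])
print(np.linalg.matrix_rank(V) == m * n)  # True
```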

EDIT. In answer to the OP, who seems not to have understood my post, point by point:

For i): a generic matrix $A=[a_{i,j}]$ is one whose entries satisfy no algebraic relations. More precisely, the $(a_{i,j})$ are parameters (they are mutually transcendental over $\mathbb{C}$). You can simulate such a matrix by choosing its entries at random. Do this on your computer instead of writing pseudo-counterexamples; you will find in particular that, for such matrices $A$ and $B$, the sums $\lambda_i+\mu_j$ are pairwise distinct.

For ii): $C(A\oplus B)$ denotes the commutant of $A\oplus B$ (a well-known notation) and is therefore a vector space. On the other hand, the commutant of a matrix with distinct eigenvalues consists exactly of the polynomials in that matrix.
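The dimension claim can also be checked directly: $XM=MX$ is the linear condition $(I\otimes M - M^\top\otimes I)\operatorname{vec}(X)=0$ (column-stacking $\operatorname{vec}$), so the commutant is the nullspace of an $N^2\times N^2$ matrix, $N=mn$. A sketch for a random $A\oplus B$:

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 2, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
M = np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)  # A (+) B, size N = mn
N = m * n

# X M = M X  <=>  (I (x) M - M^T (x) I) vec(X) = 0  (column-stacking vec)
L = np.kron(np.eye(N), M) - np.kron(M.T, np.eye(N))
nullity = N * N - np.linalg.matrix_rank(L)
print(nullity)  # mn = 4 for generic A, B (the lambda_i + mu_j are distinct)
```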

For iii): your counterexamples are only particular, well-known cases (everything you write is absolutely standard and is not the subject of my post). With probability $1$, the commutant of your matrix admits the $(A^i\otimes B^j)$ as a basis.

For iv): when one does not understand, one asks. I do not intend to spend any more time on this thread.