Proposition: If two square matrices $A,B \in \mathbb{C}^{n \times n}$ satisfy $AM = MB$ for all $M \in \mathbb{C}^{n \times n}$, then they are identical and a scalar multiple of the identity matrix: $A = B = \lambda \mathbb{1}$ for some $\lambda \in \mathbb{C}$.
Proof: Since the relation $AM = MB$ holds for every matrix $M$, we may in particular choose the elementary matrices $(M_{mn})_{ij} = \delta_{mi} \delta_{nj}$ (all entries are zero except the $(m,n)$ entry, which is $1$). Thus, $$ \sum_{j=1}^n A_{ij} (M_{mn})_{jl} = \sum_{j=1}^n (M_{mn})_{ij} B_{jl} \Leftrightarrow \sum_{j=1}^n A_{ij} \delta_{mj} \delta_{nl} = \sum_{j=1}^n \delta_{mi} \delta_{nj} B_{jl} \Leftrightarrow A_{im} \delta_{nl} = \delta_{mi} B_{nl} \,. $$ Now we distinguish the cases:
- $n=l$, $m=i$: $A_{ii} = B_{ll}$; since this holds for all $i$ and $l$, all diagonal elements of $A$ and $B$ coincide: $A_{ii} = B_{ii} = \lambda$.
- $n=l$, $m\neq i$: $A_{im} = 0$, i.e. all off-diagonal elements of $A$ vanish.
- $n\neq l$, $m=i$: $B_{nl} = 0$, i.e. all off-diagonal elements of $B$ vanish.
Thus, $A = B = \mathrm{diag}(\lambda,\dots,\lambda) = \lambda \mathbb{1}$. $\square$
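As a quick numerical sanity check (not part of the proof), the component identity $A_{im}\delta_{nl} = \delta_{mi}B_{nl}$ derived above can be verified with NumPy by multiplying arbitrary $A$, $B$ against the elementary matrices $M_{mn}$; the matrix sizes and random seed below are arbitrary choices for illustration:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

# A scalar multiple of the identity satisfies AM = MB with A = B = lambda*1
# for every M, as the proposition predicts.
lam = 2.5
M = rng.standard_normal((n, n))
assert np.allclose((lam * np.eye(n)) @ M, M @ (lam * np.eye(n)))

# For arbitrary A, B, check the elementary-matrix products entrywise:
# (A M_mn)_{il} = A_{im} delta_{nl}  and  (M_mn B)_{il} = delta_{mi} B_{nl}.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
for m in range(n):
    for k in range(n):  # k plays the role of the index n in M_mn
        M_mk = np.zeros((n, n))
        M_mk[m, k] = 1.0
        lhs = A @ M_mk
        rhs = M_mk @ B
        for i in range(n):
            for l in range(n):
                assert np.isclose(lhs[i, l], A[i, m] * (k == l))
                assert np.isclose(rhs[i, l], (m == i) * B[k, l])
```

Imposing `lhs == rhs` for all $m, k$ then forces exactly the three cases listed above.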
Question: This proposition reminded me a bit of Schur's lemma from group representation theory, but I think this case is not covered by Schur's lemma, since $A$ and $B$ can be different and $M$ need not be invertible.
I was wondering whether the proposition is just a special case of a more general form of Schur's lemma, or whether there is some other well-known theorem that I could refer to instead of the proof above. Bonus question: Is there a proof of this statement that doesn't require going to component notation?