Is there a particular condition under which two covariance matrices $A$ and $B$ commute, so that $AB=BA$? Or does it come down to numerical chance, so that we have to verify it case by case?
I mean: is it possible to state a clear set of conditions under which we can be sure that two generic covariance matrices $A$ and $B$ must commute? Do you have any examples?
In fact, two covariance matrices over a finite-dimensional vector space $V$ commute if and only if they are simultaneously diagonalisable.
Suppose there exists an invertible $P$ such that $D_A=P^{-1}AP$ and $D_B=P^{-1}BP$ are both diagonal.
Then, since diagonal matrices commute, $AB=PD_AD_BP^{-1}=PD_BD_AP^{-1}=BA$, so $A$ and $B$ commute.
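This direction is easy to check numerically. Here is a small sketch (the names `Q`, `rng`, and the chosen eigenvalues are just for this illustration): two covariance matrices built from a shared orthonormal eigenbasis commute.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random orthogonal matrix Q; its columns serve as the shared eigenvectors.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Positive eigenvalues make A and B valid covariance matrices
# (symmetric positive definite).
A = Q @ np.diag([1.0, 2.0, 3.0, 4.0]) @ Q.T
B = Q @ np.diag([5.0, 1.0, 2.0, 7.0]) @ Q.T

# Simultaneously diagonalisable => they commute.
assert np.allclose(A @ B, B @ A)
```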
Conversely suppose $A$ and $B$ commute. The idea is to consider each eigenspace of $A$ individually and then diagonalise $B$ in each of the eigenspaces.
Since $A$ and $B$ are symmetric, they are diagonalisable. Let $\lambda_1,\dots,\lambda_k$ be the distinct eigenvalues of $A$.
We can write $V=\oplus_{i=1}^{k}E_A({\lambda}_i)$.
We now show that $B$ sends $E_A(\lambda_i)$ to itself.
Let $\mathbf v\in E_A(\lambda_i)$. Then $AB\mathbf v=BA\mathbf v=\lambda_i B\mathbf v$,
so $B\mathbf v\in E_A(\lambda_i)$.
Since $B$ is symmetric and $E_A(\lambda_i)$ is $B$-invariant, the restriction of $B$ to $E_A(\lambda_i)$ is again symmetric, hence diagonalisable; i.e. there is a basis $\mathfrak B_i$ of $E_A(\lambda_i)$ consisting of eigenvectors of $B$. Doing this for every $i$, the union $\mathfrak B= \bigcup_{i=1}^k\mathfrak B_i$ is a basis of $V$ consisting of eigenvectors of both $A$ and $B$. Hence $A$ and $B$ are simultaneously diagonalisable.
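As a concrete example of the converse (these particular matrices are my own illustration, not required by the argument): the symmetric positive definite matrices below commute, so an eigenbasis of $A$ must also diagonalise $B$.

```python
import numpy as np

# Two commuting covariance matrices (both symmetric positive definite).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[3.0, 1.0], [1.0, 3.0]])
assert np.allclose(A @ B, B @ A)

# A has distinct eigenvalues, so its eigenvectors must also be eigenvectors of B.
evals_A, P = np.linalg.eigh(A)   # columns of P: orthonormal eigenvectors of A
D_B = P.T @ B @ P                # B expressed in A's eigenbasis
assert np.allclose(D_B, np.diag(np.diag(D_B)))  # it comes out diagonal
print(np.diag(D_B))              # eigenvalues of B in the shared basis
```

Here $A$ has eigenvectors $(1,1)/\sqrt2$ and $(1,-1)/\sqrt2$, and one can check by hand that these are eigenvectors of $B$ as well, with eigenvalues $4$ and $2$.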