I was wondering... Consider two different square matrices that act as linear transformations. I apply these two transformations iteratively, selecting one or the other with 50% probability (for example) at each step. We obviously know the eigenvalues of the two matrices. But can we know the expected eigenvalues, i.e. the eigenvalue distribution generated over all the possible orderings of the matrix products?
Many thanks in advance!
Let's consider a simple case first. Let $A,B$ be simultaneously diagonalizable matrices; WLOG, assume they are diagonal. The problem then reduces, entry by entry, to computing infinite products of i.i.d. discrete random variables $X_n$, each taking the value $\lambda$ with probability $p$ or $\mu$ with probability $1-p$. If either eigenvalue is $0$ (and appears in the product), the result is trivially $0$. Otherwise, take logarithms: $\log\prod_{n=1}^N |X_n| = \sum_{n=1}^N \log|X_n|$ is a random walk with drift $p\log|\lambda| + (1-p)\log|\mu|$, so by the strong law of large numbers the product converges to $0$ or blows up almost surely whenever this drift is non-zero. A non-degenerate limit can only occur in the knife-edge case where the drift vanishes, e.g. $|\lambda\mu| = 1$ when $p = 1/2$.
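A quick sanity check of the diagonal case (the specific values of $\lambda$, $\mu$, $p$ below are just illustrative assumptions): working in log space, the running sum divided by $N$ should settle near the drift $p\log\lambda + (1-p)\log\mu$, which here is negative, so the product collapses to $0$.

```python
import random
import math

# Illustrative parameters (assumed): eigenvalues lam, mu picked with prob p, 1-p.
lam, mu, p = 2.0, 0.25, 0.5
n = 10_000

random.seed(0)
# Work in log space to avoid overflow/underflow: log prod X_i = sum log X_i.
log_prod = sum(math.log(lam) if random.random() < p else math.log(mu)
               for _ in range(n))

# SLLN: (1/n) * log_prod should approach the drift p*log(lam) + (1-p)*log(mu).
drift = p * math.log(lam) + (1 - p) * math.log(mu)
print(log_prod / n, drift)  # negative drift => the product tends to 0 a.s.
```

Since the drift is $\tfrac12\log(2 \cdot 0.25) = \tfrac12\log\tfrac12 < 0$, the empirical average hugs a negative constant and the product itself vanishes.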
Now for the general (far harder) case: my educated guess is that the "eigenvalues of the overall process" will be undefined, or will fluctuate wildly under permutations of the infinite word in $A$ and $B$. There are likely many more details here, but this is as far as I can confidently say.
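One concrete piece of evidence for the permutation-sensitivity, using an assumed (arbitrary) non-commuting pair: cyclic permutations of a word preserve the spectrum (since $MN$ and $NM$ are isospectral), but non-cyclic reorderings generally do not. Already at length four, $A^2B^2$ and $(AB)^2$ have different eigenvalues despite containing the same letters.

```python
import numpy as np

# Assumed example pair: the standard non-commuting shear matrices.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])

# Two words with identical letter counts but non-cyclically different order.
AABB = A @ A @ B @ B   # eigenvalues 3 +/- 2*sqrt(2)
ABAB = A @ B @ A @ B   # eigenvalues (7 +/- sqrt(45)) / 2

ev1 = sorted(np.linalg.eigvals(AABB).real)
ev2 = sorted(np.linalg.eigvals(ABAB).real)
print(ev1, ev2)  # same letters, different spectra
```

So the spectrum of a long random word depends not just on how many times each matrix was applied, but on the order, which is why no single "expected eigenvalue" is available from the eigenvalues of $A$ and $B$ alone.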