Suppose $A_1,A_2,A_3,\ldots$ is an infinite sequence of $d\times d$ matrices sampled IID from some distribution. Under which conditions does the product $A_n A_{n-1}\cdots A_1$ converge to zero almost surely as $n\to\infty$?
The hard case is when the $A_i$ are rank-deficient, so the matrix logarithm is not defined. One such case is addressed here: $A_i = I - x_i x_i^T$ with the $x_i$ drawn from an isotropic 2-D Gaussian. Can this be generalized?