Suppose the $A_i$ are IID samples of a random matrix-valued variable. I'm interested in determining whether the following infinite product is likely to diverge:
$$A_1 A_2 A_3\cdots$$
Finding a necessary and sufficient condition is hard (Theorem 2).
There are cheap-to-compute sufficient conditions for the product to converge.
Are there cheap-to-compute sufficient conditions for the product to diverge?
Of particular interest is the case $A_i = I - x_i x_i^T$, where the $x_i$ are IID Gaussian. The isotropic Gaussian case is solved here; I'm looking for insight into the non-isotropic case.
Generate a sufficiently large sample of $x_i$ from the non-isotropic Gaussian distribution, form the matrices $A_1, A_2, \dots, A_n$, and compute the norm of their product. By Oseledets' theorem, there is a constant $\lambda$ (the top Lyapunov exponent) such that \begin{equation} \lim_{n \to \infty} \frac{1}{n} \log{(\lVert A_1 A_2 \cdots A_n \rVert)} = \lambda \end{equation} almost surely. If $\lambda > 0$, the product almost surely diverges; if $\lambda < 0$, it almost surely converges (the boundary case $\lambda = 0$ is inconclusive). For sufficiently large $n$, $\lambda$ can be estimated numerically, which should give reasonable parameters for the conditions you want.

I don't think this answers the question in the way you were hoping, unfortunately.
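The estimation above can be sketched numerically. Below is a minimal NumPy implementation (the function name, the renormalization scheme, and the default parameters are my own choices, not from the answer): to avoid floating-point over/underflow in the long product, the running product is rescaled by its spectral norm at every step and the log-norms are accumulated, so the returned value is $\frac{1}{n}\log\lVert A_1 \cdots A_n\rVert$.

```python
import numpy as np

def estimate_lyapunov(cov, n_steps=100_000, seed=0):
    """Estimate the top Lyapunov exponent of A_1 A_2 ... A_n,
    where A_i = I - x_i x_i^T and x_i ~ N(0, cov).

    Rescales the running product at each step so the accumulated
    log-norms equal log ||A_1 ... A_n|| without overflow/underflow.
    """
    rng = np.random.default_rng(seed)
    d = cov.shape[0]
    L = np.linalg.cholesky(cov)        # sample x ~ N(0, cov) as L @ z
    P = np.eye(d)
    log_norm = 0.0
    for _ in range(n_steps):
        x = L @ rng.standard_normal(d)
        P = P @ (np.eye(d) - np.outer(x, x))
        s = np.linalg.norm(P, 2)       # spectral norm of running product
        log_norm += np.log(s)
        P /= s                         # renormalize to keep P well-scaled
    return log_norm / n_steps
```

For a non-isotropic example one would pass an anisotropic covariance, e.g. `estimate_lyapunov(np.diag([2.0, 0.1]))`, and check the sign of the result. Note this is a single-trajectory estimate; averaging over several seeds gives a sense of the Monte Carlo error before trusting the sign of $\lambda$ near zero.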