Can anyone suggest a way to bound the expected number of zero eigenvalues of the following product of random matrices?
$G_1 B_1 G_2 B_2 \ldots G_n B_n$
where each $G_i$ is $d \times d$ with i.i.d. standard normal entries, while each $B_i$ has entries sampled i.i.d. uniformly from $\{0, 1\}$.
It seems that as $n$ grows, an increasing number of singular values go to $0$.

For instance, for $d=256$, $n=100$, I'm seeing 5-25 non-zero eigenvalues in this product (notebook).
Note that I am using exact arithmetic rather than floating-point numerics.
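A smaller floating-point version of the experiment (a sketch assuming NumPy; unlike the notebook above, this uses ordinary double precision, so the observed rank deficiency may include round-off effects):

```python
import numpy as np

# Floating-point sketch at smaller scale than the original d=256, n=100.
rng = np.random.default_rng(0)
d, n = 64, 50

P = np.eye(d)
for _ in range(n):
    G = rng.standard_normal((d, d))                     # Gaussian factor
    B = rng.integers(0, 2, size=(d, d)).astype(float)   # uniform 0/1 factor
    P = P @ G @ B
    P /= np.linalg.norm(P)  # renormalize to avoid overflow/underflow

# Numerical rank: number of singular values above NumPy's default tolerance.
rank = np.linalg.matrix_rank(P)
print(f"numerical rank of the product: {rank} of {d}")
```

In double precision the product becomes so ill-conditioned that most singular values fall below the rank tolerance, reproducing the "mostly zero spectrum" observation.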
Given two $d \times d$ matrices $B$ and $C$, consider $BGC$, where $G$ is a random matrix with a density on $\mathbb R^{d \times d}$ (e.g. Gaussian). Let $r_C$ and $n_C$ denote the rank and nullity of $C$, and similarly $r_B$ and $n_B$ for $B$.

The range $\operatorname{ran}(C)$ is a subspace of dimension $r_C = \operatorname{rank}(C)$. With probability $1$, $(G(\operatorname{ran}(C))) \cap \ker(B) = \{0\}$ whenever this is dimensionally possible, i.e. unless $r_C + n_B > d$, which is equivalent to $n_B > n_C$. Thus if $n_B \le n_C$, then $n_{BGC} = n_C$ with probability $1$. Similarly, if $n_B \ge n_C$, then $n_{BGC} = n_{C^\top G^\top B^\top} = n_{B^\top} = n_B$, by the same argument applied to the transpose. So $n_{BGC} = \max(n_B, n_C)$ almost surely, and by induction, with probability $1$,
$$n_{G_1 B_1 G_2 B_2 \cdots G_n B_n} = \max(n_{B_1}, \ldots, n_{B_n}).$$
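A quick numerical sanity check of this claim (a sketch assuming NumPy; ranks are computed with `matrix_rank`, which is dependable at this small size):

```python
import numpy as np

# Check n_{BGC} = max(n_B, n_C) over all rank combinations at d = 8.
rng = np.random.default_rng(1)
d = 8

def with_rank(r):
    # A d-by-d matrix of rank r (with probability 1), from Gaussian factors.
    return rng.standard_normal((d, r)) @ rng.standard_normal((r, d))

ok = True
for r_B in range(d + 1):
    for r_C in range(d + 1):
        B, C, G = with_rank(r_B), with_rank(r_C), rng.standard_normal((d, d))
        nullity = d - np.linalg.matrix_rank(B @ G @ C)
        ok = ok and nullity == max(d - r_B, d - r_C)
print("nullity(BGC) == max(n_B, n_C) in every case:", ok)
```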
So yes, $n_{G_1 B_1 \ldots G_n B_n}$ (which is the geometric multiplicity of the eigenvalue $0$) is nondecreasing in $n$, and we have $\mathbb P(n_{G_1 B_1 \ldots G_n B_n} \ge k) = 1 - \mathbb P(n_{B_1} < k)^n$. With probability $1$, $n_{G_1 B_1 \ldots G_n B_n}$ will eventually hit $d$, simply because some $B_i$ will eventually be the zero matrix. But since each $B_i$ is the zero matrix with probability $2^{-d^2}$, the expected value of the first $n$ for which that happens is $2^{d^2}$.
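For $d=3$ the distribution of $n_{B_i}$ can be pinned down exactly by enumerating all $2^9 = 512$ zero-one matrices with integer determinants (a self-contained sketch, no rounding involved):

```python
from itertools import product

# Exact count of singular 3x3 0/1 matrices via integer determinants.
d = 3
singular = 0
for bits in product((0, 1), repeat=d * d):
    a, b, c, e, f, g, h, i, j = bits  # rows (a,b,c), (e,f,g), (h,i,j)
    det = a * (f * j - g * i) - b * (e * j - g * h) + c * (e * i - f * h)
    if det == 0:
        singular += 1

p = singular / 2 ** (d * d)  # P(a single B_i is singular)
print(f"singular: {singular} of {2 ** (d * d)}, p = {p}")

# P(nullity of the n-fold product >= 1) = 1 - (1 - p)^n -> 1 as n grows.
for n in (1, 10, 100):
    print(n, 1 - (1 - p) ** n)
```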
EDIT: In fact, as $d$ increases, the probability that $B_i$ is invertible goes rapidly to $1$. So in your example ($d=256$, $n=100$), it is extremely likely that all of your matrices were actually invertible, and that any eigenvalues of $0$ were artifacts of round-off error in floating-point arithmetic.
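A Monte Carlo sketch of this trend (numerical rank via NumPy; the dimensions and sample size are illustrative):

```python
import numpy as np

# Estimate the fraction of singular d-by-d 0/1 matrices for two values of d.
rng = np.random.default_rng(2)
trials = 2000

def p_singular(d):
    count = 0
    for _ in range(trials):
        B = rng.integers(0, 2, size=(d, d))
        if np.linalg.matrix_rank(B) < d:  # exact ranks: entries are integers
            count += 1
    return count / trials

p4, p16 = p_singular(4), p_singular(16)
print(f"P(singular) at d=4: {p4},  at d=16: {p16}")
```

The singularity probability drops quickly with $d$, so at $d = 256$ a singular $B_i$ among $100$ samples is vanishingly unlikely.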