Let $(K_i)_{i\in\Bbb N}$ be a family of i.i.d. random variables such that $K_1$ has a geometric distribution. For every $i\in\Bbb N$ let $(I_j^i)_{j\in \Bbb N}$ be a family of random variables with values in $\{ 0 , 1\}$; I have no further information about this family. Assume that
$$\frac 1 n \sum_{i=1}^n \sum_{j=1}^{K_i} I_j^i\to 0, \quad n\to\infty $$
almost surely.
My question:
Let $(M^i_j)_{i,j\in\Bbb N}$ be a family of i.i.d. random variables such that $M^1_1$ has a geometric distribution and the family is independent of all the other randomness above.
Does $ A_n := \frac 1 n \sum_{i=1}^n \sum_{j=1}^{K_i} M^i_j I_j^i$ still converge to zero almost surely?
I have the feeling that, due to the independence, $A_n$ behaves like $\frac 1 n \sum_{i=1}^n \sum_{j=1}^{K_i} m I_j^i$, where $m := \Bbb E[M_1^1]$. But I see no way to establish this rigorously.
Setup:
Assume
$\{K_i\}_{i=1}^{\infty}$ are i.i.d. random variables that take nonnegative integer values and have finite mean $E[K]$.
$\{M_j^i\}_{i,j \in \mathbb{N}}$ are i.i.d. random variables that take nonnegative values and that have finite mean $E[M]$. Assume $\{M_j^i\}$ are independent of $\{K_i\}$.
$\{I_j^i\}_{i,j \in \mathbb{N}}$ are binary-valued random variables with arbitrary dependencies among themselves and with the $\{M_j^i\}$ and $\{K_i\}$ variables.
Define \begin{align} A_n &= \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i} I_j^i \\ B_n &= \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i} M_j^iI_j^i \end{align}
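As a sanity check on the definitions, note that without the indicators the limits are pinned down by Wald's identity together with the strong law of large numbers; so the hypothesis $A_n \to 0$ is a genuine restriction on the $I_j^i$. Indeed, if $I_j^i \equiv 1$, the inner sums $S_i = \sum_{j=1}^{K_i} M_j^i$ are i.i.d. with mean $E[S_1] = E[K]E[M]$ by Wald's identity, and hence
$$ \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i} M_j^i \rightarrow E[K]\,E[M] \quad \mbox{(w.p.1)}, $$
which is strictly positive in the question's setting where $K_1$ and $M_1^1$ are geometric.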
Claim:
If $A_n\rightarrow 0$ with probability 1, then $B_n\rightarrow 0$ with probability 1.
Proof:
Fix $\epsilon>0$. Let $M=M_1^1$. Since $E[M]<\infty$ there is a $c>0$ such that $E[M 1_{\{M>c\}}] \leq \epsilon$, where $1_{\{M>c\}}$ denotes the indicator that equals $1$ if $M>c$ and $0$ otherwise. Observe that for all $i,j$ we have $$ 0\leq M_j^i \leq c + M_j^i 1_{\{M_j^i>c\}}$$ Thus for all $n$ we have \begin{align} 0&\leq B_n \\ &\leq \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i} (c +M_j^i 1_{\{M_j^i>c\}}) I_j^i\\ &=c A_n + \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i}M_j^i 1_{\{M_j^i>c\}}I_j^i\\ &\leq c A_n + \frac{1}{n}\sum_{i=1}^n \sum_{j=1}^{K_i}M_j^i 1_{\{M_j^i>c\}} \end{align} where the last step uses $I_j^i \leq 1$. The inner sums $S_i := \sum_{j=1}^{K_i} M_j^i 1_{\{M_j^i>c\}}$ are i.i.d. with mean $E[S_1] = E[K]\,E[M 1_{\{M>c\}}]$ by Wald's identity, so the strong law of large numbers applies to their average. Taking $n\rightarrow \infty$ and using the fact that $A_n\rightarrow 0$ with probability 1 gives \begin{align} 0 \leq \limsup_{n\rightarrow\infty} B_n \leq 0 + E[K]E[M 1_{\{M>c\}}] \quad \mbox{(w.p.1)} \end{align} Thus, with probability 1 we have: $$ 0 \leq \limsup_{n\rightarrow\infty} B_n \leq \epsilon E[K] $$ This is true for all $\epsilon>0$ and so $B_n\rightarrow 0$ with probability 1. $\Box$
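For intuition, here is a small Monte Carlo sketch of the claim (an illustration, not part of the proof). All concrete choices are mine: I take $I_j^i = 1$ exactly when $i$ is a perfect square, which forces $A_n \to 0$ almost surely since at most $\sqrt{n}$ of the first $n$ inner sums contribute, and I sample geometric $K_i$ and $M_j^i$.

```python
import math
import random

rng = random.Random(0)

def geometric(p):
    # Sample from the geometric distribution on {1, 2, ...} with mean 1/p,
    # by inverting the tail P(X > k) = (1 - p)^k.  (1 - rng.random() lies
    # in (0, 1], so the log is always defined.)
    return 1 + int(math.log(1.0 - rng.random()) / math.log(1.0 - p))

n = 200_000
p_K, p_M = 0.5, 0.25          # E[K] = 2, E[M] = 4
sum_A = 0.0                   # running value of n * A_n
sum_B = 0.0                   # running value of n * B_n

for i in range(1, n + 1):
    K_i = geometric(p_K)
    # Example choice (my assumption): I_j^i = 1 iff i is a perfect square.
    if math.isqrt(i) ** 2 == i:
        sum_A += K_i
        sum_B += sum(geometric(p_M) for _ in range(K_i))

A_n, B_n = sum_A / n, sum_B / n
print(f"A_n = {A_n:.5f}, B_n = {B_n:.5f}")  # both should be close to 0
```

With these parameters roughly $\sqrt{n} \approx 447$ inner sums contribute, so heuristically $A_n \approx \sqrt{n}\,E[K]/n$ and $B_n \approx A_n\,E[M]$, both tending to $0$ as $n$ grows.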