Suppose that $\{X_{ij}\}$ is an array of iid Rademacher random variables indexed by $i,j\in \mathbb{N}$. By the law of large numbers, for each row of this array, $\frac{1}{n}\sum_{j=1}^{n}X_{ij}\rightarrow 0$ almost surely. Similarly, the infimum over any finite set of rows also converges to $0$ almost surely. I'm wondering what happens when we take an infimum over $f(n)$ rows as $n\rightarrow\infty$. Formally, if $$Y_{n} = \inf_{i=1,\ldots,f(n)}\frac{1}{n}\sum_{j=1}^{n}X_{ij}\,,$$ then what can we say about the convergence of $Y_{n}$ to a limit?
If $f(n)$ grows very slowly, $Y_{n}$ may still tend to $0$. I'm most interested in whether it's possible to get $Y_{n}$ to converge almost surely to a constant less than $0$.
Here we assume $\{X_{ij}\}$ are i.i.d. with $$P[X_{ij}=-1]=P[X_{ij}=1]=1/2$$
Define $$M_n^{(i)} = \frac{1}{n}\sum_{j=1}^n X_{ij}$$ For each positive integer $n$, the variables $\{M_n^{(i)}\}_{i=1}^{\infty}$ are independent and identically distributed (i.i.d.). Fix $\epsilon>0$. Since $E[X_{ij}]=0$ for all $i,j$, the Chernoff-Hoeffding bound gives: $$P[M_n^{(i)}\leq -\epsilon] \leq e^{-n\epsilon^2/2}$$
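As a quick numerical sanity check (illustrative only, not part of the argument; `exact_tail` and `hoeffding_bound` are helper names introduced here), one can compare the exact binomial tail with the Hoeffding bound, writing $\sum_{j=1}^n X_{1j} = 2B - n$ with $B \sim \mathrm{Binomial}(n, 1/2)$:

```python
from math import comb, exp, floor

def exact_tail(n, eps):
    """Exact P[M_n <= -eps] for the mean of n Rademacher variables.
    The event {2B - n <= -n*eps} is {B <= n*(1-eps)/2}."""
    k_max = floor(n * (1 - eps) / 2)
    return sum(comb(n, k) for k in range(k_max + 1)) / 2**n

def hoeffding_bound(n, eps):
    """The Chernoff-Hoeffding upper bound exp(-n*eps^2/2)."""
    return exp(-n * eps**2 / 2)

for n in [50, 100, 200]:
    print(n, exact_tail(n, 0.2), hoeffding_bound(n, 0.2))
```

The exact tail sits below the bound for every $n$, and both decay exponentially.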
Assume $f(n)$ is a positive integer for each $n$. Define $Y_n = \inf_{i \in \{1, ..., f(n)\}} M_n^{(i)}$. Then \begin{align} P[Y_n> -\epsilon] &= P[M_n^{(1)}> -\epsilon]^{f(n)} \\&\geq (1-e^{-n\epsilon^2/2})^{f(n)}\\ &\geq 1 - f(n)e^{-n\epsilon^2/2} \end{align} where the first equality uses the fact that the rows are i.i.d. and the last inequality is Bernoulli's inequality. Thus $$P[Y_n \leq -\epsilon] \leq f(n)e^{-n\epsilon^2/2}$$ If $f(n)$ grows sub-exponentially, the right-hand side goes to $0$ for any $\epsilon>0$.
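Numerically, the union-bound estimate is tiny for large $n$ even when $f(n)$ is a fairly fast polynomial (a sketch; `union_bound` is just a helper name for the displayed bound):

```python
from math import exp

def union_bound(n, eps, f_n):
    """The bound f(n) * exp(-n*eps^2/2) on P[Y_n <= -eps]."""
    return f_n * exp(-n * eps**2 / 2)

# Polynomial growth of f(n) loses to the exponential decay of the tail:
for n in [500, 1000, 5000]:
    print(n, union_bound(n, 0.2, n**3))
```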
For convergence to 0:
Suppose $f(n)$ grows slowly enough so that for all $\epsilon>0$ we have $$ \sum_{n=1}^{\infty} f(n) e^{-n\epsilon^2/2}<\infty$$ This is true, for example, for $f(n)=n^k$ for any positive integer $k$. Then by the Borel-Cantelli lemma, the events $\{Y_n\leq -\epsilon\}$ occur only finitely often (almost surely). This holds for all $\epsilon>0$, and so
$$ \liminf_{n\rightarrow\infty} Y_n \geq 0 \quad \mbox{(almost surely)} $$ On the other hand, $Y_n \leq M_n^{(1)} \rightarrow 0$ almost surely, so $\limsup_{n\rightarrow\infty} Y_n \leq 0$ almost surely, and we get $$ \lim_{n\rightarrow\infty} Y_n = 0 \quad \mbox{(almost surely)}$$
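A small Monte Carlo sketch (using numpy, with $f(n)=n$, which satisfies the summability condition above; `sample_Y` is a helper name) is consistent with this limit:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_Y(n, f_n, rng):
    """One draw of Y_n: the minimum of f(n) row means, each row holding
    n Rademacher (+/-1) variables."""
    X = rng.integers(0, 2, size=(f_n, n)) * 2 - 1
    return X.mean(axis=1).min()

for n in [10, 100, 1000, 2000]:
    print(n, sample_Y(n, n, rng))  # drifts back toward 0 as n grows
```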
For convergence to -1:
We know $-1\leq M_n^{(i)}\leq 1$ and $-1\leq Y_n\leq 1$ for all $i, n$. So $$P[M_n^{(i)}=-1] = (1/2)^n \quad \forall n, i$$ Then, since the rows are independent, $$P[Y_n > -1] = (1-(1/2)^n)^{f(n)}$$ In particular if $f(n)=2^n$ then $$ P[Y_n>-1] = (1-\frac{1}{2^n})^{2^n} \approx 1/e$$ where the $1/e$ approximation is accurate for large $n$. If we assume $f(n)$ grows rapidly enough to ensure $$ \sum_{n=1}^{\infty} (1-\frac{1}{2^n})^{f(n)}<\infty$$ (which is true, for example, if $f(n)=n2^n$), then by Borel-Cantelli the events $\{Y_n>-1\}$ occur only finitely often almost surely, so $Y_n=-1$ for all sufficiently large $n$ and $$ \lim_{n\rightarrow\infty}Y_n = -1 \quad \mbox{(almost surely)}$$
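A quick floating-point check of both regimes (illustrative only; `p_not_minus_one` is a helper name for the displayed probability):

```python
def p_not_minus_one(n, f_n):
    """P[Y_n > -1] = (1 - 2^{-n})^{f(n)}: the probability that none of
    the f(n) rows is all -1's."""
    return (1 - 0.5**n) ** f_n

# f(n) = 2^n: the probability stabilizes near 1/e, so it is not summable
for n in [5, 10, 20]:
    print(n, p_not_minus_one(n, 2**n))

# f(n) = n * 2^n: the probability decays roughly like e^{-n}, hence summable
for n in [5, 10, 20]:
    print(n, p_not_minus_one(n, n * 2**n))
```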