I'm repeating an action every ten minutes; it occurs at the $10n$-th minute of every hour. If instead I wait either $10$ or $11$ minutes between repetitions, chosen 50/50, every minute of the hour becomes equally likely in the limit. Why?
Formally: let $k$ be given, and let the $X_i$ be i.i.d. and uniformly distributed on a subset of $\{0, \ldots, k-1\}$ whose support contains two values $d_a$ and $d_b$ with $\gcd(d_b - d_a, k) = 1$. Let $Y_j = (\sum_{i=1}^j X_i)\ \textrm{mod}\ k$. Then I conjecture that the distribution of $Y_n$ converges to the uniform distribution on $\{0, \ldots, k-1\}$ as $n \to \infty$. Is this true? Why?
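A quick exact computation seems to back this up for the motivating example (a sketch; the helper name and the FFT shortcut, which uses the fact that the law of $Y_n$ is an $n$-fold cyclic convolution of the one-step law, are my own):

```python
import numpy as np

def y_distribution(support, k, n):
    """Exact law of Y_n = (X_1 + ... + X_n) mod k,
    where each X_i is uniform on `support` (a subset of {0, ..., k-1})."""
    step = np.zeros(k)
    for s in support:
        step[s % k] += 1.0 / len(support)
    # n-fold cyclic convolution = n-th pointwise power in Fourier space.
    return np.real(np.fft.ifft(np.fft.fft(step) ** n))

# Motivating example: steps of 10 or 11 minutes, hour of k = 60 minutes.
# gcd(11 - 10, 60) = 1, so the conjecture predicts uniformity.
d = y_distribution([10, 11], 60, 5000)
print(d.max() - d.min())  # should be tiny compared to 1/60
```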
Also, can the uniformity requirement on $X_i$ be relaxed? And what does convergence of a function (e.g. the probability mass function of $Y_j$) even mean? I guess since it is non-zero at only finitely many points, pointwise convergence is what I want? My own proof idea uses the CLT: since its density function is (uniformly?) continuous, increasing $n$ increases the spread, so the difference in probability mass across $k$ consecutive mod-buckets can be bounded arbitrarily close to 0. Is this a fruitful path?
Let us start with the easiest case, where $P(X_i = 0) = P(X_i = 1) = 1/2$. Define $Z_j = \sum_{i=1}^j X_i$. In this case $Z_j$ is binomially distributed with $j$ trials and success probability $1/2$. Assume for simplicity that $k$ divides $j+1$, so that for each residue $a$ the values $ki+a$ lying in $\{0,\ldots,j\}$ are exactly those with $0 \leq i \leq (j+1)/k - 1$. Therefore, for all $0\leq a<k$, $$ P(Y_j=a) = \sum_{i=0}^{(j+1)/k-1} P(Z_j = ki + a) = 2^{-j} \sum_{i=0}^{(j+1)/k-1} \binom{j}{ki+a}. $$ Now, using $$ \sum_{i=0}^{(j+1)/k-1} \binom{j}{ki+a}= \frac{1}{k}\sum_{t=0}^{k-1}\omega^{-t\,a} \big(1+\omega^t\big)^{j},$$ where $\omega$ is a primitive $k$-th (complex) root of unity (this is proven, for example, in lacunary sum of binomial coefficients), we obtain $$ P(Y_j=a) = \frac{1}{k}\sum_{t=0}^{k-1}\omega^{-t\,a} \frac{\big(1+\omega^t\big)^{j}}{2^j}. $$ In the latter sum, all terms with $t\neq0$ converge to zero as $j\to\infty$ (since $\big|1+\omega^t\big| < 2$ for $t\neq0$, each such term has absolute value strictly less than $1$ raised to the power $j$), and thus $$ P(Y_j=a) \to\frac{1}{k}, $$ as desired.
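One can verify the roots-of-unity filter and the resulting convergence numerically (an illustrative check only; the parameter values $j = 60$, $k = 7$ are an arbitrary choice of mine):

```python
import cmath
from math import comb

j, k = 60, 7
w = cmath.exp(2j * cmath.pi / k)  # primitive k-th root of unity

for a in range(k):
    # Direct lacunary sum: binomial mass on all m congruent to a mod k.
    direct = sum(comb(j, m) for m in range(a, j + 1, k)) / 2 ** j
    # Roots-of-unity filter from the identity above.
    filt = sum(w ** (-t * a) * (1 + w ** t) ** j for t in range(k)) / (k * 2 ** j)
    assert abs(direct - filt.real) < 1e-9
    print(a, direct)  # all values already close to 1/7 ≈ 0.1429
```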
I am unsure, however, whether such a combinatorial argument extends easily to arbitrary subsets. I believe the CLT might be helpful, but I don't know how to apply it exactly.
Regarding your question about the meaning of convergence of random variables: there are several definitions, presented e.g. at https://en.wikipedia.org/wiki/Convergence_of_random_variables. In your case we speak of convergence in distribution, i.e., the probability masses converge: $P(Y_j=a) \to 1/k$ for all $0\leq a<k$ as $j \to \infty$. There are stronger forms of convergence, which would need to be proven separately.
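As a purely numerical illustration (not a proof) that the uniformity of $X_i$ may not be essential, here is the same mod-$k$ convolution computed for a non-uniform step law; the particular weights and the FFT shortcut are my own choices:

```python
import numpy as np

k = 12
# Non-uniform step law: P(X = 3) = 0.7, P(X = 4) = 0.3; gcd(4 - 3, 12) = 1.
step = np.zeros(k)
step[3], step[4] = 0.7, 0.3
for n in (10, 100, 1000):
    # Law of Y_n = (X_1 + ... + X_n) mod k via n-fold cyclic convolution.
    dist = np.real(np.fft.ifft(np.fft.fft(step) ** n))
    print(n, np.abs(dist - 1 / k).max())  # deviation from uniform shrinks with n
```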