The strong law of large numbers states that for iid random variables $X_1, X_2, \ldots$ with mean $\mu = \mathbb E[X_i]$, the sample mean converges almost surely: $$\mathbb P\left(\lim_{n\to \infty}\frac{X_1 + \cdots + X_n}{n} = \mu\right) = \mathbb P\left(\lim_{n\to\infty}Y_n = \mu\right) = 1,$$ where $Y_n = (X_1 + \cdots + X_n)/n$. To see this concretely, I took $X_i \sim \text{Bernoulli}(p)$ with $p = 0.5$. The sum $S_n = X_1 + \cdots + X_n$ of $n$ such variables is Binomial, $S_n \sim \text{Binomial}(n, 0.5)$, so $Y_n = S_n/n$. In this case $\mu = p = 0.5$, so as $n$ increases I expected to see $\mathbb P(Y_n = 0.5) \to 1$. I used Julia to plot the pmfs of $Y_n$ for several values of $n$, rescaling the x-axis to $[0, 1]$ (since $Y_n$ takes values in $[0, 1]$). Here is the plot.
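Writing $S_n = X_1 + \cdots + X_n \sim \text{Binomial}(n, 0.5)$ for the unscaled sum, the probability I am tracking is the central value of the Binomial pmf:

$$\mathbb P\left(\frac{S_n}{n} = 0.5\right) = \mathbb P\left(S_n = \frac{n}{2}\right) = \binom{n}{n/2} 2^{-n} \quad (n \text{ even}).$$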
Contrary to my expectations, as $n$ increases, $\mathbb P(Y_n = \mu)$ decreases! I'm sure I'm making some kind of conceptual mistake, but I can't see where I'm going wrong. Why does the probability of $Y_n = \mu$ go down as $n$ increases?
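To double-check that this wasn't just a plotting artifact, I also printed the midpoint probability directly (a quick sanity check using the same Distributions.jl `pdf` call as in my plotting code; `n ÷ 2` is integer division, and $n$ is always even here):

```julia
using Distributions

# P(Y_n = 0.5) = P(S_n = n/2), where S_n ~ Binomial(n, 0.5):
# the Binomial pmf evaluated at its center.
for n in 10 .^ (1:5)
    p_mid = pdf(Binomial(n, 0.5), n ÷ 2)
    println("n = $n:  P(Y_n = 0.5) = $p_mid")
end
```

Each printed value is smaller than the last, matching what the plot shows.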
For reference, the code I used:
using Plots, Distributions
plot(title="PDFs of Binomial Random Variables")
for n in 10 .^ (1:5)
    # pmf of Binomial(n, 0.5) at 0:n, with x rescaled to [0, 1]
    plot!(LinRange(0, 1, n + 1), pdf.(Binomial(n, 0.5), 0:n), label="n=$n")
    # plot!(LinRange(0, 1, n + 1), cdf.(Binomial(n, 0.5), 0:n), label="n=$n")
end
end
plot!(dpi=300)
savefig("plot.png")
