Consider the set $\{-K,-K+1,\dots,0,1,\dots,K\}$ and the random variable $a$ that picks an integer from this set uniformly at random. We expect the mean of $a$ to be $0$.
How many samples do we need for the sample mean to lie in the range $[-\delta,\delta]$ with probability at least $p$?
Let us denote $$ a_n=\frac1n\sum_{k=1}^nX_k, $$ where the $X_k$'s are i.i.d. random variables with the uniform distribution on $\{-K,\ldots,K\}$. By Chebyshev's inequality, $$ P(|a_n|>\delta)\le\frac{\operatorname{Var}a_n}{\delta^2} $$ or, equivalently, $$ P(|a_n|\le\delta)\ge1-\frac{\operatorname{Var}a_n}{\delta^2}. $$ Since $$ \operatorname{Var}a_n=\frac1{n^2}\cdot n\cdot \operatorname{Var}X_1=\frac1n\cdot\frac{(2K+1)^2-1}{12}, $$ we have $$ P(|a_n|\le\delta)\ge1-\frac1{\delta^2}\cdot\frac1n\cdot\frac{(2K+1)^2-1}{12}. $$ By choosing $n$ large enough, we can make the right-hand side greater than or equal to $p$, the desired probability. Solving the inequality for $n$, it suffices to take $$ n\ge\frac{(2K+1)^2-1}{12\,\delta^2(1-p)}. $$
This is only a bound (Chebyshev's inequality is typically quite loose), but I hope it is useful.