Say we have i.i.d. random variables $x_i$ whose mean and variance are both $1$. Then the sample mean $s_n=\frac{1}{n}\sum_{i=1}^n x_i$ has mean $1$ and variance $\frac{1}{n}$. Given a sufficiently small positive $\epsilon$, say $\epsilon\in(0,0.5)$, what bound can we obtain for $P(0<s_n<\epsilon)$ that holds for all $n$?
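As a quick sanity check of the setup (my own illustration, using Exponential(1) variables, which have mean $1$ and variance $1$), the sample mean indeed concentrates with variance $1/n$:

```python
import numpy as np

# Sketch: x_i ~ Exponential(1) has mean 1 and variance 1, so the
# sample mean s_n should have mean 1 and variance 1/n.
rng = np.random.default_rng(0)
n, trials = 50, 200_000
s_n = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)
print(s_n.mean())  # close to 1
print(s_n.var())   # close to 1/n = 0.02
```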
For example, if we consider the Chebyshev inequality, then since $0<s_n<\epsilon$ implies $|s_n-1|>1-\epsilon$, we have \begin{align} P(0<s_n<\epsilon)&\leq P(|s_n-1|\geq 1-\epsilon)\\ &=P\left(|s_n-1|\geq \sqrt{n}(1-\epsilon)\cdot\frac{1}{\sqrt{n}}\right)\\ &\leq\frac{1}{n(1-\epsilon)^2}. \end{align} This bound is rather loose because of the first inequality. I also checked other inequalities, like Hoeffding's, yet they are all bounds on tail probabilities of the form $P(|s_n-1|\geq a)$. If instead we take $x_i\sim\mathcal{N}(1,1)$, then $s_n\sim\mathcal{N}(1,\frac{1}{n})$, and bounding the density of $s_n$ on $(0,\epsilon)$ by its value at $\epsilon$ gives $$P(0<s_n<\epsilon)<\epsilon\frac{\sqrt{n}}{\sqrt{2\pi}}\exp\left(-\frac{n(1-\epsilon)^2}{2}\right)=t_0\sqrt{n}\,e^{-cn}\,\epsilon,$$ where $t_0=\frac{1}{\sqrt{2\pi}}$ and $c=\frac{(1-\epsilon)^2}{2}>0$.
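To see how much looser the Chebyshev bound is, here is a small numerical comparison (my own check, assuming $x_i\sim\mathcal{N}(1,1)$, with arbitrary choices $\epsilon=0.3$, $n=20$): a Monte Carlo estimate of $P(0<s_n<\epsilon)$ against the two bounds above.

```python
import numpy as np

# Compare: Monte Carlo estimate of P(0 < s_n < eps) for Gaussian x_i,
# the Chebyshev bound 1/(n(1-eps)^2), and the Gaussian density bound
# eps * sqrt(n/(2*pi)) * exp(-n(1-eps)^2/2).
rng = np.random.default_rng(1)
eps, n, trials = 0.3, 20, 200_000
s_n = rng.normal(loc=1.0, scale=1.0, size=(trials, n)).mean(axis=1)
mc = np.mean((0 < s_n) & (s_n < eps))
chebyshev = 1 / (n * (1 - eps) ** 2)
gaussian = eps * np.sqrt(n / (2 * np.pi)) * np.exp(-n * (1 - eps) ** 2 / 2)
print(mc, gaussian, chebyshev)  # the estimate sits well below both bounds
```

The exponentially decaying Gaussian bound is already orders of magnitude tighter than Chebyshev here, which is what motivates the question below.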
I wonder what bounds similar to the one above, i.e., bounds that decrease to $0$ as $\epsilon\to 0$, can be obtained in general. Perhaps additional assumptions on the $x_i$ are needed. Thanks.