Earlier I asked the question Expected distance for a Gaussian variable to its nearest integer and got a good answer. The expected distance is very close to $1/4$, just as in the uniform case.
Then I went on to examine the distribution of $|X-R(X)|$, where $R(X)$ denotes the nearest integer to $X$.
As users Did and leonboy pointed out, for $\sigma \rightarrow 0$ the distances are concentrated, while for $\sigma \rightarrow \infty$, $|X-R(X)| \rightarrow \mathrm{Uniform}(0, 1/2)$.
The computer simulation I ran suggests that the variance does not need to go to infinity for the distribution to look uniform. See the figures below.
My question is: how fast does $|X-R(X)|$ converge to the uniform distribution as $\sigma$ increases?
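For anyone who wants to reproduce the experiment, here is a minimal Monte Carlo sketch (my own code, not the simulation the question refers to; the choices $\mu = 0.3$ and the grid of $\sigma$ values are arbitrary):

```python
import random

def nearest_int_distance_samples(mu, sigma, n, seed=0):
    """Draw n samples of |X - round(X)| for X ~ N(mu, sigma^2)."""
    rng = random.Random(seed)
    return [abs(x - round(x)) for x in (rng.gauss(mu, sigma) for _ in range(n))]

# As sigma grows, the sample mean of |X - R(X)| should approach 1/4,
# the mean of Uniform(0, 1/2).
for sigma in (0.1, 0.5, 2.0):
    d = nearest_int_distance_samples(0.3, sigma, 100_000)
    print(f"sigma={sigma}: mean distance = {sum(d) / len(d):.4f}")
```

Plotting a histogram of `d` for each $\sigma$ reproduces the qualitative picture: concentrated for small $\sigma$, essentially flat on $(0, 1/2)$ already for moderate $\sigma$.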



Thanks
For $0 < y < 1/2$, $|x - R(x)| = y$ iff $x = n \pm y$ for some integer $n$, and $\dfrac{d}{dx} |x - R(x)| = \pm 1$ there, so the PDF of $Y = |X - R(X)|$ is
$$ f_Y(y) = \sum_{n = -\infty}^\infty \bigl(f_X(n + y) + f_X(n-y)\bigr), $$
where $f_X$ is the PDF of $X$. By the Poisson summation formula,
$$ \sum_{n=-\infty}^\infty f_X(n+y) = \sum_{k=-\infty}^\infty e^{2 \pi i k y}\, \widehat{f_X}(k), $$
where in this case
$$ \widehat{f_X}(k) = e^{-2 \pi^2 k^2 \sigma^2 - 2 \pi i k \mu}. $$
Thus for $0 < y < 1/2$,
$$ f_Y(y) = 2 + 4 \sum_{k=1}^\infty \cos(2\pi k y) \cos(2 \pi k \mu)\, e^{-2 \pi^2 k^2 \sigma^2}. $$
The constant term $2$ is the uniform density on $(0, 1/2)$, and the remaining terms decay very rapidly as $\sigma$ increases: $\sup_y |f_Y(y) - 2| \le 4 \sum_{k=1}^\infty e^{-2 \pi^2 k^2 \sigma^2}$, which is dominated by the $k = 1$ term $4 e^{-2 \pi^2 \sigma^2}$ (about $10^{-8}$ already at $\sigma = 1$). So the convergence to uniform is faster than exponential in $\sigma^2$.