Let $(x_n)$ be a sequence of real numbers that is equidistributed modulo $1$ (for example, $x_n=n\sqrt{2}$ for positive integers $n$). It is well known that the average value of the fractional parts $\{x_n\}$, taken over all $n \leq N$, tends to $1/2$ as $N\rightarrow \infty$; similarly, the average value of $\{x_n\}^2$ tends to $1/3$. The convergence to these values is often described by a probabilistic approach (e.g. citing the central limit theorem, the progressive tendency to the standard normal CDF, and so on).
I am interested in describing this convergence rate using Big-$O$ notation, to point out its asymptotic behaviour in a more direct way. In particular, writing
$$\frac1N \sum_{1\leq n\leq N} \{x_n\}= \frac{1}{2}+R(N)$$
and
$$\frac1N \sum_{1\leq n\leq N} \{x_n\}^2= \frac{1}{3}+S(N)$$
I would like to know the magnitude of the error terms $R(N)$ and $S(N)$ as $N \rightarrow \infty$. Taking into account the CLT, the properties of the Irwin–Hall distribution, and the Berry–Esseen theorem, it seems that both error terms have magnitude $O(N^{-1})$. However, I would be happy to see a formal proof of this.
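As a numerical sanity check (not a proof), one can compute $R(N)$ and $S(N)$ directly for the example $x_n = n\sqrt{2}$ from the question and watch whether the rescaled quantities $N\,R(N)$ and $N\,S(N)$ stay bounded, which is what the $O(N^{-1})$ guess predicts up to possible logarithmic factors. The sketch below assumes this specific choice of $x_n$; the function name `moments` is just an illustrative label.

```python
import math

def moments(N, alpha=math.sqrt(2)):
    """Return the averages of {n*alpha} and {n*alpha}^2 for n = 1..N."""
    s1 = s2 = 0.0
    for n in range(1, N + 1):
        f = (n * alpha) % 1.0  # fractional part {n*alpha}
        s1 += f
        s2 += f * f
    return s1 / N, s2 / N

for N in (10**3, 10**4, 10**5, 10**6):
    m1, m2 = moments(N)
    R = m1 - 0.5          # error term R(N)
    S = m2 - 1.0 / 3.0    # error term S(N)
    print(f"N={N:>8}  N*R(N)={N*R:+.3f}  N*S(N)={N*S:+.3f}")
```

If the conjectured rate is right, the printed values of $N\,R(N)$ and $N\,S(N)$ should not blow up as $N$ grows (a slow drift consistent with a $\log N$ factor would not be ruled out by such an experiment).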