Consider the uniform distribution on $[a, b]$. Let $X_1, X_2, \ldots, X_N$ be random variables drawn independently from that distribution. How can we determine the expected variance of the $N$ random variables?
I'm not sure if that's the proper terminology, so I'll state explicitly the value I'm looking for. Let $$Y = \frac{1}{N} (X_1 + X_2 + \cdots + X_N).$$ I want to determine $$E\left[\frac{1}{N} \sum_{i = 1}^N (Y - X_i)^2\right]$$
Here's my progress so far (note that $E[X_i^2] = \int_a^b \frac{x^2}{b-a}\,dx$, since the uniform density on $[a,b]$ is $\frac{1}{b-a}$): \begin{align*} E\left[\frac{1}{N} \sum_{i = 1}^N (Y - X_i)^2\right] &= \frac{1}{N} \sum_{i = 1}^N \left( E[Y^2] + E[X_i^2] - 2E[X_i Y] \right)\\ &= E[Y^2] + E[X_i^2] - 2E[X_i Y]\\ &= E[Y^2] + E[X_i^2] - \frac{2(N - 1)}{N} E[X_i]^2 - \frac{2}{N} E[X_i^2]\\ &= E[Y^2] + \left(1 - \frac{2}{N}\right) \int_a^b \frac{x^2}{b-a}\, dx - \frac{2(N - 1)}{N} \left(\frac{a + b}{2} \right)^2\\ &= E[Y^2] + \left(1 - \frac{2}{N}\right) \frac{b^3 - a^3}{3(b-a)} - \frac{2(N - 1)}{N} \left(\frac{a + b}{2} \right)^2\end{align*}
Is there a simpler way to evaluate $E[Y^2]$ than expanding $Y^2$ and applying the expectation to each term? Am I doing everything else right?
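For reference, here is a quick Monte Carlo estimate of the quantity I'm after (the function name is my own, just for this sanity check):

```python
import random

def mean_squared_deviation(a, b, n, trials=200_000):
    """Estimate E[(1/N) * sum_i (Y - X_i)^2] for X_i ~ Uniform(a, b)."""
    total = 0.0
    for _ in range(trials):
        xs = [random.uniform(a, b) for _ in range(n)]
        y = sum(xs) / n                          # sample mean Y
        total += sum((y - x) ** 2 for x in xs) / n
    return total / trials

print(mean_squared_deviation(0, 1, 5))
```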
You're on the right path. The key to the proof is to expand $Y^2$ and $X_iY$ as sums and take the expectation term by term, noting that by independence $$ E[X_iX_j]= \begin{cases} E[X_i^2], & i=j\\ E[X_i]E[X_j], & i\neq j. \end{cases} $$ A full proof of the fact $$ E\left[\frac{1}{N}\sum_{i=1}^N(Y-X_i)^2\right]=\frac{N-1}{N}Var[X], $$ which holds for any random variable with finite variance, is found here. Once you have convinced yourself of this fact, note that $$ X=\frac{a+b}{2}+X^\prime, $$ where $X^\prime\sim\operatorname{Uniform}(-(b-a)/2,(b-a)/2)$. Since adding a constant does not change variance, $$ Var[X]=Var\left[\frac{a+b}{2}+X^\prime\right]=Var[X^\prime]. $$ Because $E[X^\prime]=0$, it follows that $$ Var[X^\prime]=\int_{-(b-a)/2}^{(b-a)/2}\frac{x^2}{b-a}\,dx=\frac{((b-a)/2)^3}{3(b-a)}-\frac{(-(b-a)/2)^3}{3(b-a)}=\frac{(b-a)^2}{12}. $$ So we have $$ E\left[\frac{1}{N}\sum_{i=1}^N(Y-X_i)^2\right]=\frac{N-1}{N}\cdot\frac{(b-a)^2}{12}. $$
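Since the identity $E\left[\frac{1}{N}\sum_i(Y-X_i)^2\right]=\frac{N-1}{N}Var[X]$ holds for any distribution with finite variance, you can check it numerically for both the uniform case and a non-uniform one. A minimal sketch (the function name is mine):

```python
import random

def biased_sample_var_mean(draw, n, trials=200_000):
    """Monte Carlo estimate of E[(1/N) * sum_i (Y - X_i)^2],
    where each X_i is produced by calling draw()."""
    total = 0.0
    for _ in range(trials):
        xs = [draw() for _ in range(n)]
        y = sum(xs) / n
        total += sum((y - x) ** 2 for x in xs) / n
    return total / trials

# Uniform(2, 5): Var = (5-2)^2/12 = 0.75, so N=4 gives (3/4)*0.75 = 0.5625
print(biased_sample_var_mean(lambda: random.uniform(2, 5), 4))   # ≈ 0.5625

# Exponential(rate 1): Var = 1, so N=4 gives 3/4
print(biased_sample_var_mean(lambda: random.expovariate(1.0), 4))  # ≈ 0.75
```

The second check illustrates that the factor $\frac{N-1}{N}$ comes from the sampling, not from the uniform distribution; only the value of $Var[X]$ changes.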