A real number $x_i$ is assigned to each member $i$ of the population $\{1,\ldots,N\}.$ When the index $I$ is random and uniformly distributed in this population, the variance of the random variable $x_I$ is $$ \sigma^2 = \frac 1 N \sum_{i=1}^N (x_i - \mu)^2 \text{ where } \mu = \frac 1 N \sum_{i=1}^N x_i. \tag 0 $$
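(As a concrete sanity check of $(0)$, here is a small hypothetical five-element population, with values chosen only for illustration; the formula agrees with NumPy's population variance.)

```python
import numpy as np

# Hypothetical population; the values are illustrative, not from the question.
x = np.array([2.0, 5.0, 7.0, 11.0, 13.0])
N = len(x)

mu = x.sum() / N                    # population mean, as in (0)
sigma2 = ((x - mu) ** 2).sum() / N  # population variance, as in (0)

# np.var with its default ddof=0 computes exactly this quantity.
assert np.isclose(sigma2, np.var(x))
```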
Now let $\mathscr I$ be a random subset of the population of size $n \le N,$ uniformly distributed among all $\binom Nn$ subsets of size $n.$
It can be shown that $$ \operatorname{E}\left( \frac{N-1}{N(n-1)} \sum_{i\,\in\,\mathscr I} (x_i - \bar x_{\mathscr I})^2 \right) = \sigma^2, \text{ where } \bar x_{\mathscr I} = \frac 1 n \sum_{i\,\in\,\mathscr I} x_i, \tag 1 $$ i.e. the expression in parentheses is an unbiased estimator of $\sigma^2.$
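(Identity $(1)$ can be checked exactly, not just by simulation: since $\mathscr I$ is uniform over all $\binom Nn$ subsets, averaging the estimator over every subset computes its expectation. A quick check with a hypothetical population:)

```python
import numpy as np
from itertools import combinations

# Hypothetical population; values chosen only for illustration.
x = np.array([2.0, 5.0, 7.0, 11.0, 13.0])
N, n = len(x), 3
sigma2 = np.var(x)  # population variance, as in (0)

# Average the estimator in (1) over ALL subsets of size n:
# this is the exact expectation, since the subset is uniform.
vals = []
for I in combinations(range(N), n):
    xs = x[list(I)]
    vals.append((N - 1) / (N * (n - 1)) * ((xs - xs.mean()) ** 2).sum())
estimate = np.mean(vals)

assert np.isclose(estimate, sigma2)  # unbiasedness, as claimed in (1)
```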
(“Unbiased estimator of $\sigma^2$” does not simply mean a random variable whose expected value is $\sigma^2.$ For example, $$ \frac 1 n \sum_{i\,\in\,\mathscr I} (x_i - \mu)^2 \qquad \text{(with $\mu$ as in $(0)$)} $$ is not an unbiased estimator of $\sigma^2,$ although its expected value is $\sigma^2.$ That is because it is not a statistic, i.e. you cannot know its value based only on the sample $\{x_i : i\in\mathscr I\},$ because you cannot know $\mu$ without observing the whole population.)
Line $(1)$ can also be written as $$ \operatorname{E} \left( \frac{\sum_{i\,\in\,\mathscr I} (x_i - \bar x_{\mathscr I})^2}{\sum_{i\,=\,1}^N (x_i - \mu)^2} \right) = \frac N {N-1} \cdot \frac {n-1} n. \tag 2 $$ Line $(2)$ doesn't have as much symmetry as I might like: only the numerator is random, and expectation is linear, so we can't just take reciprocals on the right side and within the parentheses on the left side.
Two answers (so far) to this question derive $(1).$
My question is whether there is some slick and elegant way to prove $(1),$ in contrast to the pedestrian methods used in those answers.
I am not sure whether this is elegant or any better than the answers at the link.
Let $(X,Y) = (x_I, x_J),$ where $(I,J)$ is uniformly distributed over ordered pairs of distinct indices in $\{1,\dots,N\}$. Assume WLOG that $\sum_{k=1}^N x_k = 0$. Then $\mathrm E[X] = \mathrm E[Y] = 0$ and $\mathrm E[X^2] = \mathrm E[Y^2] = \frac1N \sum_{k=1}^N x_k^2 = \sigma^2$.
Permute $(x_1,\dots,x_N)$ randomly to get $X_1,\dots,X_N$. Then $$ 0 = \mathrm E\left[\Big(\sum_{k=1}^N X_k\Big)^2\right] = \sum_{k=1}^N \mathrm{E}[X_k^2] + \sum_{j\neq k} \mathrm{E}[X_k X_j] = N\, \mathrm E[X^2] + N(N-1)\,\mathrm E[X Y], \tag{3} $$ whence $$ \mathrm E[X Y] = - \frac{\sigma^2}{N-1}. $$
Note that we are interested in $$ \mathrm E\bigg [\frac1n \sum_{k=1}^n(X_k- \overline X_n)^2\bigg] = \mathrm E\bigg [\frac1n \sum_{k=1}^n X^2_k- \overline X_n^2\bigg], $$ where $$ \overline X_n = \frac1n \sum_{k=1}^nX_k. $$
We have $$ \mathrm E\bigg [\frac1n \sum_{k=1}^n X^2_k\bigg] = \sigma^2 $$ and, similarly to $(3)$, $$ \mathrm E\big[\overline X_n^2\big] = \frac1{n^2} \left( n\,\mathrm E[X^2] + n(n-1)\,\mathrm E[X Y]\right) = \frac1n \mathrm E[X^2] + \frac{n-1}{n}\mathrm E[X Y] = \frac{N-n}{n(N-1)}\sigma^2. $$
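(Both intermediate expectations can be verified by exact enumeration on a small hypothetical population: ordered distinct pairs for $\mathrm E[XY]$, and all size-$n$ subsets for $\mathrm E[\overline X_n^2]$, since the first $n$ entries of a random permutation form a uniform random subset of size $n$.)

```python
import numpy as np
from itertools import combinations, permutations

x = np.array([2.0, 5.0, 7.0, 11.0, 13.0])  # hypothetical population
x = x - x.mean()                            # WLOG the population sums to 0
N, n = len(x), 3
sigma2 = np.var(x)

# E[XY]: average over all ordered pairs of distinct indices.
exy = np.mean([x[i] * x[j] for i, j in permutations(range(N), 2)])
assert np.isclose(exy, -sigma2 / (N - 1))

# E[Xbar_n^2]: average the squared subset mean over all size-n subsets.
exbar2 = np.mean([x[list(I)].mean() ** 2 for I in combinations(range(N), n)])
assert np.isclose(exbar2, (N - n) / (n * (N - 1)) * sigma2)
```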
Therefore, $$ \mathrm E\bigg [\frac1n \sum_{k=1}^n(X_k- \overline X_n)^2\bigg] = \sigma^2 - \frac{N-n}{n(N-1)}\sigma^2 = \frac{N(n-1)}{n(N-1)}\sigma^2, $$ and multiplying through by $\frac{n(N-1)}{N(n-1)}$ recovers $(1).$
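(The final identity can likewise be confirmed exactly by averaging $\frac1n\sum_{k=1}^n(X_k-\overline X_n)^2$ over all size-$n$ subsets of a hypothetical population.)

```python
import numpy as np
from itertools import combinations

x = np.array([2.0, 5.0, 7.0, 11.0, 13.0])  # hypothetical population
N, n = len(x), 3
sigma2 = np.var(x)

# np.var of a subset is exactly (1/n) * sum (x_i - xbar)^2;
# averaging over all subsets gives the exact expectation.
mean_ss = np.mean([np.var(x[list(I)]) for I in combinations(range(N), n)])
assert np.isclose(mean_ss, N * (n - 1) / (n * (N - 1)) * sigma2)
```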