Suppose I have two discrete independent random variables $X$ and $Y$, and that I'm interested in the expected value of the random variable $W$, where: $$ W= \text{sign}(X-Y). $$ That is, $W$ is $1$ if $X>Y$, $-1$ if $Y>X$, and $0$ if $X=Y$.
I sample the distributions of $X$ and $Y$ ten times each, giving me $\{X_1, \dots, X_{10}\}$ and $\{Y_1, \dots, Y_{10}\}$.
Consider these two ways to estimate $\text{E}\{W\}$:
$$
\quad\quad\bar{W} = \frac{1}{10}\sum_{i=1}^{10} W_{i,i}, \\
\text{and, } \quad\quad
\bar{W}' = \frac{1}{100}\sum_{i=1}^{10}\sum_{j=1}^{10} W_{i,j}, \\
\text{where } \quad W_{i,j} = \text{sign}(X_i - Y_j)
$$
I know that $\text{Var}\{\bar{W}\} = \frac{1}{10}\text{Var}\{W\}$, but what is $\text{Var}\{\bar{W}'\}$, and how can I estimate it from my 20 samples?
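For concreteness, here is how the two estimators can be computed with NumPy (the Poisson distributions are just an illustrative choice on my part):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice of distributions: X ~ Poisson(3), Y ~ Poisson(4)
x = rng.poisson(3, size=10)   # X_1, ..., X_10
y = rng.poisson(4, size=10)   # Y_1, ..., Y_10

# Paired estimator: averages W_{i,i} = sign(X_i - Y_i) over the 10 diagonal pairs
w_bar = np.mean(np.sign(x - y))

# All-pairs estimator: averages W_{i,j} = sign(X_i - Y_j) over all 100 pairs
w_all = np.sign(x[:, None] - y[None, :])   # 10 x 10 matrix of W_{i,j}
w_bar_prime = w_all.mean()
```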
$\def\sign{\mathop{\mathrm{sign}}} \def\Cov{\mathop{\mathrm{Cov}}} $Write $m$ and $n$ for the two sample sizes (here $m=n=10$). You have an empirical distribution $P$ of $X$ and $Y$, with the probability mass function $p(x,y)=\frac1{mn}\sum_{i,j}[x=x_i][y=y_j]$, and your estimate of $\mathbb{E}[\sign(X-Y)]$ is $$ \mathbb{E}^P[\sign(X-Y)] = \frac1{mn}\sum_{i,j}\sign(x_i-y_j) = \bar W'. $$
You can calculate an estimate of variance in exactly the same way as $$ \mathbb{V}^P[\sign(X-Y)] = \mathbb{E}^P[\sign(X-Y)^2] - (\mathbb{E}^P[\sign(X-Y)])^2. $$ Since $\sign(a)^2=1$ when $a\neq0$, the first expectation is $$ Q = \mathbb{E}^P[\sign(X-Y)^2] = \mathbb{P}^P[X\neq Y] = \frac1{mn}\sum_{i,j}[x_i\neq y_j]. $$ So the sample variance is $$ \mathbb{V}^P[\sign(X-Y)] = Q-(\bar W')^2. $$
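Numerically, $\bar W'$ and the plug-in variance $Q - (\bar W')^2$ fall out of a single broadcasted sign matrix. A short NumPy sketch (function name and toy samples are my own):

```python
import numpy as np

def sign_stats(x, y):
    """Empirical mean and variance of sign(X - Y) under the
    product empirical distribution of the two samples x, y."""
    w = np.sign(x[:, None] - y[None, :])       # W_{ij} = sign(x_i - y_j)
    w_bar = w.mean()                            # E^P[sign(X - Y)] = W-bar'
    q = np.mean(x[:, None] != y[None, :])       # Q = P^P[X != Y]
    var = q - w_bar**2                          # V^P[sign(X - Y)]
    return w_bar, var

# Toy samples for illustration
x = np.array([1, 2, 2, 5, 7])
y = np.array([2, 3, 3, 6])
w_bar, var = sign_stats(x, y)
```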
The variance of $\bar W'$ is calculated in the same way. Under the true probability distribution we have $$ \mathbb{V}\left[\frac1{mn}\sum_{i,j}\sign(X_i-Y_j)\right] = \frac1{mn}\mathbb{V}[\sign(X-Y)] + \frac1{m^2n^2}\sum_{(i,j)\neq(k,l)}\Cov(W_{ij},W_{kl}). $$ The covariance between $W_{ij}$ and $W_{kl}$ is zero when $i\neq k$ and $j\neq l$, so the only nonzero covariances are $Q_X = \Cov(W_{ij},W_{il})$ with $j\neq l$ (shared $X_i$) and $Q_Y = \Cov(W_{ij},W_{kj})$ with $i\neq k$ (shared $Y_j$), and both of these are independent of the indices. Therefore if we let $X,X_1,X_2$ and $Y,Y_1,Y_2$ be independent copies of $X$ and $Y$, then $$ Q_X = \Cov(\sign(X-Y_1),\sign(X-Y_2)), \qquad Q_Y = \Cov(\sign(X_1-Y),\sign(X_2-Y)). $$ Counting the pairs ($mn(n-1)$ ordered pairs share an $X_i$, and $nm(m-1)$ share a $Y_j$), the variance of $\bar W'$ is $$ \mathbb{V}[\bar W'] = \frac1{mn}\mathbb{V}[\sign(X-Y)] + \frac{n-1}{mn}Q_X + \frac{m-1}{mn}Q_Y. $$
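This formula for $\mathbb{V}[\bar W']$ can be sanity-checked by simulation. In the sketch below (the small uniform supports are my own toy choice), $\mathbb{V}[\sign(X-Y)]$, $Q_X$ and $Q_Y$ are computed exactly by enumerating the supports, and the resulting formula value is compared against the Monte Carlo variance of $\bar W'$ over many replications:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 10, 10

# Toy distributions: X uniform on {0,1,2}, Y uniform on {0,1}
xs, ys = np.array([0, 1, 2]), np.array([0, 1])

# Exact moments by enumeration over the (uniform) supports
s = np.sign(xs[:, None] - ys[None, :]).astype(float)  # sign(x - y) table
ew = s.mean()                                # E[sign(X - Y)]
var_w = (s**2).mean() - ew**2                # V[sign(X - Y)]
qx = np.mean(s.mean(axis=1)**2) - ew**2      # Cov(sign(X-Y1), sign(X-Y2))
qy = np.mean(s.mean(axis=0)**2) - ew**2      # Cov(sign(X1-Y), sign(X2-Y))
var_formula = var_w/(m*n) + (n-1)/(m*n)*qx + (m-1)/(m*n)*qy

# Monte Carlo: variance of W-bar' over many independent sample sets
reps = 20000
wbars = np.empty(reps)
for r in range(reps):
    x = rng.choice(xs, size=m)
    y = rng.choice(ys, size=n)
    wbars[r] = np.sign(x[:, None] - y[None, :]).mean()
var_mc = wbars.var()
```

With these supports the formula gives $\mathbb{V}[\bar W'] = \frac{5/9}{100} + \frac{9}{100}\cdot\frac{7}{18} + \frac{9}{100}\cdot\frac19 \approx 0.0506$, and the Monte Carlo estimate should land close to it.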
You can estimate these terms as expectations under the empirical distribution, so $Q_X$, for example, becomes: $$ \begin{aligned} Q_X &= \mathbb{E}^P[\sign(X-Y_1)\sign(X-Y_2)] - \mathbb{E}^P[\sign(X-Y_1)]\mathbb{E}^P[\sign(X-Y_2)] \\&= \frac1{mn^2}\sum_{i,j,k}\sign(x_i-y_j)\sign(x_i-y_k) - \left( \frac1{mn}\sum_{i,j}\sign(x_i-y_j)\right)^2. \end{aligned}$$
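Putting the pieces together, a plug-in estimator of $\mathbb{V}[\bar W']$ from the two samples alone might look like this (the function name is mine; it substitutes the empirical versions of $\mathbb{V}[\sign(X-Y)]$, $Q_X$ and $Q_Y$ into the formula above):

```python
import numpy as np

def var_wbar_prime(x, y):
    """Plug-in estimate of V[W-bar'], with all expectations taken
    under the empirical distribution of the two samples."""
    m, n = len(x), len(y)
    w = np.sign(x[:, None] - y[None, :])            # W_{ij} = sign(x_i - y_j)
    w_bar = w.mean()                                 # E^P[sign(X - Y)]
    var_w = np.mean(x[:, None] != y[None, :]) - w_bar**2   # Q - (W-bar')^2
    # (1/(m n^2)) sum_{i,j,k} sign(x_i - y_j) sign(x_i - y_k)
    # equals the mean over i of (row mean)^2; Q_Y is the same by columns.
    qx = np.mean(w.mean(axis=1)**2) - w_bar**2
    qy = np.mean(w.mean(axis=0)**2) - w_bar**2
    return var_w/(m*n) + (n-1)/(m*n)*qx + (m-1)/(m*n)*qy
```

With $m=n=10$ this uses exactly the 20 samples from the question; the row/column-mean trick just avoids writing out the triple sums explicitly.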