Let's assume that we have two random vectors:
$P = (p_1, ..., p_n), p_i \sim \operatorname{Bernoulli}(0.5)$
$Q = (q_1, ..., q_n), q_i \sim \operatorname{Uniform}(0, 1)$
Then the random variable $X_n$ is defined as follows:
$$ X_n = \sum_{i : p_i = 1}{q_i} - \sum_{j: p_j = 0}{q_j} $$
Apparently, $\operatorname{Var}[X_n] = \frac{n}{3}$ (I ran many simulations and this value is quite close to the empirical one). But how can this be obtained analytically? The value resembles the quantity $nE[q_i^2]$. Is there a relationship between these two quantities?
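For reference, a minimal simulation along the lines the question describes (the sample size and seed here are arbitrary choices, not from the question):

```python
import numpy as np

# Monte Carlo estimate of Var(X_n) for n = 30 (arbitrary choice).
rng = np.random.default_rng(0)
n, trials = 30, 200_000
p = rng.integers(0, 2, size=(trials, n))   # p_i ~ Bernoulli(0.5)
q = rng.random(size=(trials, n))           # q_i ~ Uniform(0, 1)
# X_n: add q_i where p_i = 1, subtract q_i where p_i = 0
x = np.where(p == 1, q, -q).sum(axis=1)
print(x.var())                             # close to n / 3 = 10
```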
The variable $X_n$ is equal to $$\sum_{i=1}^n\left(p_i\cdot q_i -\left( 1-p_i \right) \cdot q_i \right) = \sum_{i=1}^n\left(2p_i -1\right)q_i$$
Since the $n$ terms $(2p_i-1)q_i$ are i.i.d., the variance of the sum is $n$ times the variance of a single term, so it suffices to find the variance of one term and multiply by $n$.
Let $Y = \left(2p_i-1\right)q_i$. Since $p_i$ and $q_i$ are independent, the expectation of the product factors: $E[Y] = E[2p_i-1]\,E[q_i] = \left(2E[p_i]-1\right)E[q_i] = 0$. For $E\left[Y^2\right]$, note that $(2p_i-1)^2 = 1$ whether $p_i$ is $0$ or $1$, so $E\left[(2p_i-1)^2\right] = 1$, and $E\left[q_i^2\right] = \int_0^1 x^2 \, dx = \frac{1}{3}$. Therefore $E\left[Y^2\right] = E\left[(2p_i-1)^2\right] \cdot E\left[q_i^2\right] = \frac{1}{3}$.
Using $\operatorname{Var}(Y) = E\left[Y^2\right] - E[Y]^2$ and $E[Y] = 0$, we get $\operatorname{Var}(Y) = \frac{1}{3}$. Then $$\operatorname{Var}(X_n) = n\cdot \operatorname{Var}(Y) = \frac{n}{3}$$
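The key single-term computation can be checked numerically as well; this is just a sketch with an arbitrary sample size:

```python
import numpy as np

# Check E[Y] = 0 and Var(Y) = 1/3 for a single term Y = (2p - 1) q.
rng = np.random.default_rng(1)
trials = 1_000_000
p = rng.integers(0, 2, size=trials)   # Bernoulli(0.5)
q = rng.random(size=trials)           # Uniform(0, 1)
y = (2 * p - 1) * q
print(y.mean(), y.var())              # close to 0 and 1/3
```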