Given a probability distribution $\pi(x)$ over the linear space $\mathcal{X}$ and i.i.d. samples $\{X_i\}_{i=1}^N$ drawn from $\pi(x)$, one forms the empirical approximation $\hat{\pi}(x) = \frac{1}{N} \sum_{i=1}^N \delta_{X_i}(x)$ (e.g. see the development of MC/particle filtering techniques given here).
My understanding is that under this approximation the expectation of any test function $\phi: \mathcal{X} \to \mathbb{R}$ is estimated as $E[\phi(X)] \approx \frac{1}{N} \sum_{i=1}^N \phi(X_i)$. The author then states that this estimator is unbiased and has variance $\frac{1}{N} \left( \int \phi^2(x)\pi(x)\, dx - E[\phi(X)]^2 \right)$, i.e. $\mathrm{Var}(\phi(X))/N$.
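For context, here is a quick numerical check I ran (my own sketch, not from the reference): taking $\pi$ to be the standard normal and $\phi(x) = x^2$, we have $E[\phi(X)] = 1$ and $\int \phi^2 \pi\,dx - E[\phi(X)]^2 = E[X^4] - 1 = 2$, so the claimed variance of the estimator is $2/N$. Repeating the estimate over many independent trials matches both claims:

```python
import numpy as np

# Sanity check of the claimed formula with pi = N(0,1), phi(x) = x^2.
# Then E[phi(X)] = 1 and Var(phi(X)) = E[X^4] - E[X^2]^2 = 3 - 1 = 2,
# so the estimator (1/N) sum phi(X_i) should have mean 1 and variance 2/N.
rng = np.random.default_rng(0)
N, trials = 1_000, 20_000

estimates = np.array([
    np.mean(rng.standard_normal(N) ** 2)  # one realization of (1/N) sum phi(X_i)
    for _ in range(trials)
])

print(estimates.mean())  # close to 1, consistent with unbiasedness
print(estimates.var())   # close to 2/N = 0.002, the claimed variance
```

So the formula checks out empirically; what I am missing is the algebraic derivation.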
I don't see what steps to take to derive this variance expression.