I am reading "Large Deviations for Random Graphs" by Sourav Chatterjee. Exercise 6.3 asks the following question. Let $G_{n, p}$ be the Erdős–Rényi random graph on $n$ vertices with edge probability $p$.
Prove that $\mathbb{E}(t(H, G_{n, p}))=p^{e(H)},$ and as $n\to \infty,$ $t(H, G_{n, p})$ converges to $p^{e(H)}$ in probability.
Here $H=(V(H), E(H))$ is a finite simple graph on, say, $k$ vertices, and $e(H)$ denotes the number of edges in $H$. And, $$t(H, G_{n, p}):=\int_{[0, 1]^{|V(H)|}}\prod_{ij\in E(H)} f_{n, p}(x_i, x_j)\, dx_1\cdots dx_k,$$ where $f_{n, p}(x, y)$ is a symmetric measurable function, called the "graphon of $G_{n, p}$", defined as follows: for given $x, y$ there exist unique $i, j$ such that $x\in (i/n, (i+1)/n]$ and $y\in (j/n, (j+1)/n]$. Define $$f_{n, p}(x, y)= 1 \text{ if } ij\in E(G_{n, p})$$ and $0$ otherwise.
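Since $f_{n,p}$ is constant on each square of side $1/n$, the integral collapses to a finite sum: $t(H,G) = \hom(H,G)/n^k$, the fraction of all maps $V(H)\to V(G)$ that send every edge of $H$ to an edge of $G$. A quick sketch of this reduction (the helper name `t_hom` is mine, not Chatterjee's notation):

```python
from itertools import product

def t_hom(H_edges, k, G_adj):
    # t(H, G) as a finite sum: f_{n,p} is constant on each 1/n x 1/n
    # square, so the graphon integral equals hom(H, G) / n^k.
    n = len(G_adj)
    hom = sum(all(G_adj[phi[i]][phi[j]] for i, j in H_edges)
              for phi in product(range(n), repeat=k))
    return hom / n ** k

# H = triangle mapped into G = triangle: only the 6 bijections work,
# so t = 6 / 3^3.
triangle = [(0, 1), (1, 2), (0, 2)]
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(t_hom(triangle, 3, adj))  # 6/27 = 0.2222...
```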
I have two problems here. First, I can show that $\mathbb{E}(t(H, G_{n, p}))\to p^{e(H)}$ as $n\to \infty$, but I do not necessarily get equality at each $n$ (even for large $n$). Second, I want to show that the variance of $t(H, G_{n, p})$ goes to $0$ as $n\to \infty$, but I am stuck on the variance calculation.
The function $t(H,G)$ is simply the probability that a uniformly random map $f : V(H) \to V(G)$ is a graph homomorphism from $H$ to $G$. When we make $G = G_{n,p}$ random as well, the expected value $\mathbb E[t(H,G_{n,p})]$ is also a probability: it is the probability that, when we choose a random $f$ and random edges in $G_{n,p}$, the map $f$ is a graph homomorphism.
This is not exactly equal to $p^{e(H)}$, but that is the dominant term. There is a probability of $\frac{n!/(n-k)!}{n^k} = 1 - o(1)$ that $f$ sends all $k$ vertices of $H$ to distinct vertices of $G$, in which case we need $e(H)$ specific edges of $G_{n,p}$ to all be present, which happens with probability $p^{e(H)}$. All other cases can only contribute $o(1)$ to the answer.
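To justify the $1 - o(1)$ claim, note that the injectivity probability admits an elementary bound via a union bound over the $\binom{k}{2}$ pairs of vertices of $H$:

$$\frac{n!/(n-k)!}{n^k} = \prod_{i=0}^{k-1}\frac{n-i}{n} = \prod_{i=0}^{k-1}\left(1-\frac{i}{n}\right) \ge 1 - \sum_{i=0}^{k-1}\frac{i}{n} = 1 - \frac{\binom{k}{2}}{n} \xrightarrow{\ n\to\infty\ } 1.$$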
To see that the answer does not miraculously become exact, consider the case $H = P_3$ (the path with $3$ vertices, say with edges $ab$ and $bc$). Here, we can reason as follows:

- With probability $1-\frac1n$, $f(a) \ne f(b)$, and then the edge $f(a)f(b)$ is present with probability $p$. (If $f(a)=f(b)$, we would need a loop at $f(a)$, which never exists.)
- Given that, $f(c)=f(a)$ with probability $\frac1n$, in which case the edge $f(b)f(c)$ is the same edge $f(a)f(b)$ and is already present; $f(c)=f(b)$ with probability $\frac1n$, which again would require a loop; and otherwise (probability $1-\frac2n$) the edge $f(b)f(c)$ is a new potential edge, present independently with probability $p$.
Therefore $\mathbb E[t(P_3,G_{n,p})] = (1-\frac1n)p\left[\frac1n + (1-\frac2n)p\right]$, which has the correct form of $p^2 + o(1)$, but is not exactly $p^2$.
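This expectation can be verified exactly for small $n$ by brute force, summing over all $2^{\binom n2}$ graphs weighted by their probabilities (the function name `expected_t_P3` is mine, and exact rational arithmetic is used to avoid rounding):

```python
import itertools
from fractions import Fraction

def expected_t_P3(n, p):
    # Exact E[t(P_3, G_{n,p})] by enumerating all 2^C(n,2) graphs on
    # n vertices and weighting each by its probability.
    pairs = list(itertools.combinations(range(n), 2))
    total = Fraction(0)
    for mask in range(1 << len(pairs)):
        E = {e for i, e in enumerate(pairs) if (mask >> i) & 1}
        prob = p ** len(E) * (1 - p) ** (len(pairs) - len(E))
        adj = [[(min(u, v), max(u, v)) in E for v in range(n)]
               for u in range(n)]
        # hom(P_3, G): ordered walks a-b-c using two edges of G
        hom = sum(adj[a][b] and adj[b][c]
                  for a in range(n) for b in range(n) for c in range(n))
        total += prob * Fraction(hom, n ** 3)
    return total

n, p = 3, Fraction(1, 2)
formula = (1 - Fraction(1, n)) * p * (Fraction(1, n) + (1 - Fraction(2, n)) * p)
assert expected_t_P3(n, p) == formula  # = 1/6, not p^2 = 1/4
```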
For the variance calculation, let $X_f$ be the indicator variable equal to $1$ if a specific function $f$ is a graph homomorphism $H \to G_{n,p}$. Then, averaging over the $n^k$ possible maps, $t(H,G_{n,p}) = \frac1{n^k} \sum_f X_f$.
Chapter 4 of Alon and Spencer's Probabilistic Method prepares you well for finding the variance of sums like these. I do not know if Chatterjee discusses this as much. But the salient points are that:

- $\operatorname{Var}\left(\sum_f X_f\right) = \sum_{f,g} \operatorname{Cov}(X_f, X_g)$, and $\operatorname{Cov}(X_f, X_g) = 0$ whenever the images of $f$ and $g$ are disjoint, since then $X_f$ and $X_g$ depend on disjoint sets of edges of $G_{n,p}$.
- The number of pairs $(f,g)$ whose images intersect is at most $k^2 n^{2k-1}$ (pick which coordinate of $g$ collides with which coordinate of $f$), and each covariance is at most $1$, so $\operatorname{Var}(t(H,G_{n,p})) \le k^2/n \to 0$.
- Chebyshev's inequality then upgrades $\operatorname{Var} \to 0$ to convergence in probability to $p^{e(H)}$.
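To see the decay concretely, in the simplest case $H = K_2$ the variance has a closed form, since $\hom(K_2, G) = 2m$ where $m \sim \operatorname{Binomial}\left(\binom n2, p\right)$ is the number of edges. A quick check (this is just the simplest instance, not the general argument; the function name is mine):

```python
from fractions import Fraction
from math import comb

def var_t_edge(n, p):
    # Exact Var(t(K_2, G_{n,p})): t = 2m/n^2 with m ~ Bin(C(n,2), p),
    # so Var(t) = (4/n^4) * C(n,2) * p * (1 - p) ~ 2 p(1-p) / n^2.
    return Fraction(4, n ** 4) * comb(n, 2) * p * (1 - p)

p = Fraction(1, 3)
print([float(var_t_edge(n, p)) for n in (10, 100, 1000)])  # decays like 1/n^2
```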