Limit in distribution for a random graph


I am stuck on this problem: let $R=2K_{3}\sqcup C_{4}$ (two triangles and a $4$-cycle, all vertex-disjoint), let $p=\frac{c}{n}$, and let $Z$ be the number of copies of $R$ in $G(n,p)$. What is the limit in distribution of $Z$ as $n\rightarrow\infty$? Any help would be appreciated. Thanks.



Best Answer

You have probably already seen how to show that if $\mathbf X_n$ is the number of copies of $K_3$ in $G(n, \frac cn)$, then $\mathbf X_n$ converges in distribution to $\mathbf X \sim \textit{Poisson}(\frac{c^3}{6})$. A similar result holds for $\mathbf Y_n$, the number of copies of $C_4$.
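As a quick numerical sanity check on those two Poisson parameters (the code and function names below are my own illustration, not part of the argument), the expected counts $\mathbb E[\mathbf X_n] = \binom n3 p^3$ and $\mathbb E[\mathbf Y_n] = 3\binom n4 p^4$ do converge to $\frac{c^3}{6}$ and $\frac{c^4}{8}$:

```python
from math import comb

def expected_triangles(n, c):
    # E[X_n] = C(n,3) * p^3 with p = c/n; tends to c^3/6
    p = c / n
    return comb(n, 3) * p**3

def expected_four_cycles(n, c):
    # each 4-set of vertices supports 3 distinct 4-cycles,
    # so E[Y_n] = 3 * C(n,4) * p^4; tends to c^4/8
    p = c / n
    return 3 * comb(n, 4) * p**4

c = 2.0
for n in (10**2, 10**4, 10**6):
    print(n, expected_triangles(n, c), expected_four_cycles(n, c))
# limits for c = 2: c^3/6 = 1.333..., c^4/8 = 2.0
```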

The number of copies of $R$ will not have a Poisson limit, because $R$ is not strictly balanced. Instead, if $\mathbf Z_n$ is the number of copies of $R$ in $G(n, \frac cn)$, then we essentially have $\mathbf Z_n = \binom{\mathbf X_n}{2} \mathbf Y_n$: choose two of the triangles and one of the $4$-cycles. This has a much stranger distribution. A value like $\mathbf Z_n = 6$ is much more likely than we'd otherwise expect, because it has many representations in this form: $6 = \binom 42 \cdot 1 = \binom 32 \cdot 2 = \binom 22 \cdot 6$.
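To see this lumpiness concretely, here is a small sketch (my own illustration) enumerating the ways to write a value $m$ as $\binom x2 \cdot y$:

```python
from math import comb

def representations(m, max_x=100):
    # all ways to write m = C(x, 2) * y with x >= 2, y >= 1
    reps = []
    for x in range(2, max_x):
        pairs = comb(x, 2)
        if pairs > m:
            break
        if m % pairs == 0:
            reps.append((x, m // pairs))
    return reps

print(representations(6))  # [(2, 6), (3, 2), (4, 1)]
print(representations(5))  # [(2, 5)] -- only one representation
```

So $\mathbf Z_n = 6$ can arise from three different $(\mathbf X_n, \mathbf Y_n)$ pairs, while $\mathbf Z_n = 5$ forces $\mathbf X_n = 2$, $\mathbf Y_n = 5$.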

Moreover, even this is not quite right: $\mathbf Z_n \le \binom{\mathbf X_n}{2} \mathbf Y_n$, with strict inequality when we must exclude choices of $K_3$'s and $C_4$ that share vertices, since the three pieces of $R$ are vertex-disjoint. So you need to argue that these exclusions don't affect the limit. (Maybe by first considering all subgraphs that are the union of two $K_3$'s and a $C_4$, and then showing that the ones that don't look like $R$ are unlikely to appear at all.)

We might hope that the limit is $\mathbf Z = \binom{\mathbf X}{2} \mathbf Y$, where $\mathbf X \sim \textit{Poisson}(\frac{c^3}{6})$, $\mathbf Y \sim \textit{Poisson}(\frac{c^4}{8})$, and $\mathbf X$ is independent of $\mathbf Y$. (That's a complete description of the distribution of $\mathbf Z$.)
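A minimal Monte Carlo sketch of this candidate limit (my own illustration; the Poisson sampler is Knuth's product-of-uniforms algorithm, and the choice $c = 2$ and sample size are assumptions for the demo):

```python
import math
import random
from math import comb

def poisson(lam, rng):
    # Knuth's algorithm: count uniforms until their product drops below e^{-lam}
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k - 1

def sample_limit_Z(c, trials=200_000, seed=0):
    # Z = C(X, 2) * Y with X ~ Poisson(c^3/6), Y ~ Poisson(c^4/8) independent
    rng = random.Random(seed)
    lam, mu = c**3 / 6, c**4 / 8
    return [comb(poisson(lam, rng), 2) * poisson(mu, rng)
            for _ in range(trials)]

samples = sample_limit_Z(2.0)
mean = sum(samples) / len(samples)
# E[Z] = E[C(X,2)] * E[Y] = (lam^2 / 2) * mu; for c = 2 this is 16/9
print(mean)
```

The empirical mean should sit near $16/9 \approx 1.78$, using the factorial-moment identity $\mathbb E\binom{\mathbf X}{2} = \frac{\lambda^2}{2}$ for a Poisson variable.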

To show this, we need to show that $\mathbf X_n$ and $\mathbf Y_n$ don't just have Poisson limits, but have asymptotically independent Poisson limits. With the method of moments, this is done very similarly to showing what the limits are to begin with:

  • We checked that $\mathbb E[\mathbf X_n (\mathbf X_n - 1) \dotsm (\mathbf X_n - k+1)] \to \lambda^k$ for all constant $k$ to show that $\mathbf X_n$ converges to $\textit{Poisson}(\lambda)$.
  • We now check that $\mathbb E[\mathbf X_n \dotsm (\mathbf X_n - k+1) \mathbf Y_n \dotsm (\mathbf Y_n - \ell+1)] \to \lambda^k \mu^\ell$ for all constant $k, \ell$ to show that $(\mathbf X_n,\mathbf Y_n)$ converges to a pair of independent $\textit{Poisson}(\lambda)$ and $\textit{Poisson}(\mu)$ variables.

The way to find these expected values is also very similar. The product $\mathbf X_n \dotsm (\mathbf X_n - k+1) \mathbf Y_n \dotsm (\mathbf Y_n - \ell+1)$ is counting sequences of $k$ distinct copies of $K_3$ followed by $\ell$ distinct copies of $C_4$. You should argue that the dominant term in the expectation comes from sequences where all the $K_3$'s and $C_4$'s are edge-disjoint.
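A hedged sketch of that dominant-term computation (the counting and names are my own illustration): choosing the $k$ triangles and then the $\ell$ four-cycles on pairwise disjoint vertex sets, and multiplying by $p^{3k+4\ell}$, already produces $\lambda^k \mu^\ell$ in the limit:

```python
from math import comb

def dominant_joint_moment(n, c, k, l):
    # ordered sequences of k vertex-disjoint triangles followed by
    # l vertex-disjoint 4-cycles, each edge present with prob p = c/n
    p = c / n
    count, free = 1, n
    for _ in range(k):
        count *= comb(free, 3)  # vertex set of the next triangle
        free -= 3
    for _ in range(l):
        count *= comb(free, 4) * 3  # vertex set and cycle structure of the next C4
        free -= 4
    return count * p**(3 * k + 4 * l)

# for c = 2, k = 2, l = 1 the limit is (c^3/6)^2 * (c^4/8) = 32/9
print(dominant_joint_moment(10**6, 2.0, 2, 1))
```

The sequences with shared vertices contribute lower-order terms, which is exactly the edge-disjointness argument the answer asks for.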