Chromatic number of a random graph


Let $p = \frac{c}{n}$ with $c < 1$. Consider the Erdős–Rényi random graph $G(n, p)$. Prove that asymptotically almost surely $\chi(G) \le 3$.

What I tried is to show that a.a.s. every connected component of $G$ is a tree or a graph with exactly one cycle (in the same spirit as the facts that for $p = o\left(\frac{1}{n^2}\right)$ there are a.a.s. no edges, and for $p = o\left(\frac{1}{n}\right)$ a.a.s. no cycles). I know that I should apply Chebyshev's inequality to some random variable, but I do not see which one to choose.

Best answer:

The components you want to rule out (the ones that are neither trees nor unicyclic) are often called "complex components" in the study of random graphs. We can characterize them by saying that they have $k$ vertices (for some $k$) and at least $k+1$ edges.
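As an empirical sanity check (my own addition, not part of the proof), here is a small stdlib-only Python simulation that samples $G(n, c/n)$ and checks whether any component has more edges than vertices; the function name `has_complex_component` is mine:

```python
import random

def has_complex_component(n, p, rng):
    """Sample G(n, p); report whether some component has >= k+1 edges on k vertices."""
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen = [False] * n
    for s in range(n):
        if seen[s]:
            continue
        # DFS over the component of s, counting vertices and half-edges
        stack, seen[s] = [s], True
        verts = degree_sum = 0
        while stack:
            u = stack.pop()
            verts += 1
            degree_sum += len(adj[u])
            for v in adj[u]:
                if not seen[v]:
                    seen[v] = True
                    stack.append(v)
        if degree_sum // 2 > verts:  # more edges than vertices: complex
            return True
    return False

rng = random.Random(0)
c, trials = 0.5, 100
freqs = {n: sum(has_complex_component(n, c / n, rng) for _ in range(trials)) / trials
         for n in (100, 300)}
print(freqs)  # empirically small, shrinking roughly like 1/n
```

With $c = 0.5$ the bound derived below predicts that a complex component appears with probability $O(1/n)$, and the observed frequencies are consistent with that.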

As an upper bound on the expected number of complex components, we can count a more specific type of them: the number of minimal subgraphs (not necessarily induced) that contain two cycles. This is equal to $$\sum_{k=4}^n \binom{n}{k} C_k p^{k+1}$$ where $C_k$ denotes the number of (labeled) connected $k$-vertex graphs with $k+1$ edges and two cycles.

Such graphs consist either of two edge-disjoint cycles joined by a (possibly trivial) path, or of a single cycle with a path joining two of its vertices (a theta graph). In either case, the graph contains a spanning path with $k-1$ edges, plus one extra edge from each endpoint of the path to some other vertex of the path, so we have $C_k \le k^2 \cdot k!$.
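This bound is easy to verify by brute force for small $k$. Note that a connected graph on $k$ vertices with $k+1$ edges automatically has cycle rank $2$, so counting all such connected graphs is the same as counting $C_k$. The following stdlib-only Python sketch (helper names are mine) enumerates them:

```python
import math
from itertools import combinations

def count_Ck(k):
    """Number of labeled connected graphs on k vertices with exactly k+1 edges."""
    possible_edges = list(combinations(range(k), 2))
    count = 0
    for edge_set in combinations(possible_edges, k + 1):
        # union-find to test connectivity of the chosen edge set
        parent = list(range(k))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v in edge_set:
            parent[find(u)] = find(v)
        if len({find(v) for v in range(k)}) == 1:
            count += 1
    return count

for k in range(4, 8):
    assert count_Ck(k) <= k**2 * math.factorial(k)
print(count_Ck(4))  # 6: the five-edge sets on 4 vertices are K_4 minus an edge, all connected
```

For $k = 4$ the bound $k^2 \cdot k! = 384$ is far above the true count $6$, which is fine: only the order of magnitude matters for the argument.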

(In an earlier version of this answer I tried to bound $C_k$ via Cayley's formula, which only gives $C_k = O(k^{k+2})$, and that is not good enough.)

In addition to this estimate, we bound $\binom nk \le \frac{n^k}{k!}$ and let the sum range over all $k \ge 1$: $$\sum_{k=4}^n \binom{n}{k} C_k p^{k+1} \le \sum_{k=1}^\infty \frac{n^k}{k!} \cdot k^2 \cdot k! \cdot \left(\frac cn\right)^{k+1} = \frac 1n \cdot \sum_{k=1}^\infty k^2 \cdot c^{k+1}.$$ The remaining sum no longer depends on $n$ and converges for every $c<1$, so it is $O(1)$ and the overall expression is $O(1/n)$. Now what you call Chebyshev's inequality (and what I would call Markov's inequality) shows that the probability of seeing a complex component is also $O(1/n)$. Finally, once there are no complex components, every component is a tree or unicyclic; trees are $2$-colorable and unicyclic graphs are $3$-colorable, so a.a.s. $\chi(G) \le 3$.
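For the record, the tail sum even has a closed form: $\sum_{k\ge1} k^2 c^{k+1} = \frac{c^2(1+c)}{(1-c)^3}$ (differentiate the geometric series twice). A few lines of Python (function names are my own) confirm this numerically against the partial sums:

```python
def bound_partial(n, c, terms=100_000):
    """Partial sum of (1/n) * sum_{k>=1} k^2 c^(k+1)."""
    return sum(k * k * c ** (k + 1) for k in range(1, terms + 1)) / n

def bound_closed(n, c):
    """Closed form c^2 (1+c) / ((1-c)^3 n) of the same bound."""
    return c * c * (1 + c) / ((1 - c) ** 3 * n)

for c in (0.1, 0.5, 0.9):
    assert abs(bound_partial(1000, c) - bound_closed(1000, c)) < 1e-9
print(bound_closed(1000, 0.5))  # 0.003
```

So for $c = 0.5$ and $n = 1000$, the expected number of complex components is at most $3/n = 0.003$, matching the $O(1/n)$ claim.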