Clique numbers and Theorem 4.5.1 in "The Probabilistic Method" by Alon and Spencer


My question is "What is the precise formulation of the following theorem from Alon and Spencer's book The Probabilistic Method?"

For context,

  1. Let $G(n,1/2)$ be the probability space of random graphs on $n$ labeled vertices where each possible edge is included independently with probability $1/2$.

  2. The quantity $f(k)$ is the expected number of cliques of size $k$ in a graph drawn from $G(n,1/2)$, i.e. $f(k) = {n \choose k}2^{-{k\choose 2}}$.

  3. The authors note that if $k \sim 2\log_2n$, then the function $f(k)$ "drops under one," whatever that means, and $f(k)$ is "very roughly like $n^k2^{-k^2/2}$," whatever that means.

  4. If $G$ is a graph, let $\omega(G)$ be the size of the largest clique in $G$ (i.e. the number of vertices in the largest complete subgraph of $G$).
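To make these definitions concrete, here is a short Python sketch (my own, not from the book) that evaluates $f(k)$ in base-2 logarithm form, which avoids overflow since $f(k)$ itself can be astronomically large or small, and locates the least $k$ at which $f$ drops under one:

```python
import math

def log2_f(n, k):
    # log2 of f(k) = C(n, k) * 2^(-C(k, 2)), computed in log form
    # so that astronomically large/small values of f(k) stay finite.
    return math.log2(math.comb(n, k)) - math.comb(k, 2)

n = 1000
print(2 * math.log2(n))  # about 19.93: the k ~ 2 log2(n) scale

# The least k (for this fixed n) with f(k) < 1, i.e. log2 f(k) < 0.
k_star = next(k for k in range(1, n + 1) if log2_f(n, k) < 0)
print(k_star, log2_f(n, k_star - 1), log2_f(n, k_star))
```

For $n = 1000$ this gives $k^* = 16$, noticeably below $2\log_2 1000 \approx 19.9$: the lower-order terms in the asymptotics are still large at such small $n$, but $k^*(n)/(2\log_2 n) \to 1$ as $n \to \infty$.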

Here's the theorem:

Theorem 4.5.1 Let $k = k(n)$ satisfy $k \sim 2\log_2n$ and $f(k) \to \infty$. Then almost always $\omega(G) \ge k$.

What I (think I) understand already:

  1. $k\sim 2\log_2n$ means $\frac{k(n)}{2\log_2 n}\to 1$ as $n\to\infty$.

  2. "almost always $\omega(G)\ge k$" means that for each $\epsilon > 0$ there is an $N = N(\epsilon)$ such that for all $n\ge N$, the probability that a graph $G$ drawn from $G(n,1/2)$ satisfies $\omega(G)\ge k(n)$ is greater than $1-\epsilon$.

What I do not understand:

  1. How can we assume that $f(k)\to \infty$ as $n\to\infty$ if we assume that $k\sim 2\log_2n$? By point 3 above in the context I gave, assuming that $k\sim 2\log_2n$ means "$f(k)$ drops under one," so how can $f(k)$ also go to infinity? (Based on this point, an appropriate answer to my question will have to address what is being said precisely in point 3 from the context.)

Many thanks.

Best answer:

A more precise statement of the theorem would be

Let $k(n)$ be a function of $n$ such that $k(n) \sim 2 \log_2 n$ as $n \to \infty$ (that is, $\lim\limits_{n \to \infty} \frac{k(n)}{2 \log_2 n} = 1$) and $f(k(n)) \to \infty$ as $n \to \infty$. Then $$\lim_{n \to \infty} \Pr[\omega(G) \ge k(n)] = 1$$ where $G$ has the $G(n,\tfrac12)$ distribution.
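As a sanity check on the limit statement (my own illustration, at a small $n$ where brute force is feasible): with $n = 20$ we have $2\log_2 n \approx 8.6$ and $f(5) = \binom{20}{5}2^{-10} \approx 15$, so cliques of size $5$ should appear in almost every sample.

```python
import itertools
import random

def has_clique(adj, size):
    # Brute-force check for a clique on `size` vertices, where `adj`
    # is a list of neighbor sets; feasible only for small n.
    n = len(adj)
    for verts in itertools.combinations(range(n), size):
        if all(v in adj[u] for u, v in itertools.combinations(verts, 2)):
            return True
    return False

def sample_gnp(n, rng):
    # Draw from G(n, 1/2): each edge present independently with prob 1/2.
    adj = [set() for _ in range(n)]
    for u, v in itertools.combinations(range(n), 2):
        if rng.random() < 0.5:
            adj[u].add(v)
            adj[v].add(u)
    return adj

rng = random.Random(0)
n, k, trials = 20, 5, 200
hits = sum(has_clique(sample_gnp(n, rng), k) for _ in range(trials))
print(hits / trials)  # empirically close to 1
```

This is only an illustration at one small $n$, of course; the theorem is an asymptotic statement about $n \to \infty$.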

So you seem to understand the theorem correctly.

To clarify the statement that seems to be confusing you: saying

The function $f(k)$ drops under one at $k \sim 2 \log_2 n$

means that if we let $k^*(n)$ be the least value $k$ (for a fixed value of $n$) such that $f(k) < 1$, then $k^*(n) \sim 2 \log_2 n$.

There are lots of functions $k(n)$ satisfying $k(n) \sim 2 \log_2 n$. Some of them will satisfy $f(k(n)) < 1$ for all $n$, and some others will have $f(k(n)) \to \infty$ as $n \to \infty$. In fact, for $k$ in this range, $f(k)$ changes so rapidly (it is decreasing in $k$ there, dropping by a factor of $n^{1+o(1)}$ with each step) that we can define a function $k(n)$ with $f(k(n)-1) \to \infty$ as $n\to\infty$ and yet $f(k(n)+1) \to 0$ as $n\to\infty$. The theorem requires that we pick a "going-to-$\infty$" kind of $k$.
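One way to see this drop numerically (a quick sketch of my own, not from the answer): the one-step ratio is $f(k)/f(k+1) = \frac{k+1}{n-k}\,2^{k}$, and near $k \sim 2\log_2 n$ the factor $2^k$ is about $n^2$, so the ratio is about $n^{1+o(1)}$.

```python
import math

def log2_f(n, k):
    # log2 of f(k) = C(n, k) * 2^(-C(k, 2))
    return math.log2(math.comb(n, k)) - math.comb(k, 2)

# Near k ~ 2 log2(n), each step in k divides f by roughly n^(1+o(1)):
# log2 of the one-step drop, measured in units of log2(n), tends to 1.
for n in [10**3, 10**6, 10**9]:
    k = round(2 * math.log2(n))
    drop = log2_f(n, k) - log2_f(n, k + 1)  # log2 of f(k)/f(k+1)
    print(n, k, drop / math.log2(n))
```

The printed ratio is above 1 at these moderate $n$ (about 1.4 at $n=10^3$) and creeps toward 1 as $n$ grows, consistent with the $n^{1+o(1)}$ rate.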

Also, the statement

$f(k)$ is very roughly like $n^k 2^{-k^2/2}$

is, just as it says, a very rough asymptotic estimate: it's not accurate to within a $1+o(1)$ factor or even to within a constant factor, but it's close to the truth. More precisely, going from $\binom nk$ to $n^k$ drops a factor of roughly $k!$, and going from $2^{-\binom k2}$ to $2^{-k^2/2}$ drops a factor of $2^{k/2}$; for $k \sim 2\log_2 n$, both dropped factors are much less significant than either of the factors we keep.

We could make the statement precise by saying that $\log \binom nk \sim \log (n^k)$ and $\log 2^{-\binom k2} \sim \log (2^{-k^2/2})$ when $k \sim 2 \log_2 n$ (both as $n \to \infty$; note this holds term by term, since the two terms nearly cancel in $\log f(k)$ itself), but that's missing the point: the reason we make this estimate is to get a rough idea of how big $f(k)$ is before we analyze it more precisely.
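To put numbers on "close to the truth" (my own sketch): the dropped factors are visible but small next to the two main factors, even though they throw $f(k)$ itself off by an enormous multiplicative factor.

```python
import math

# Compare log2 f(k) with the rough estimate log2(n^k * 2^(-k^2/2)).
# The dropped factors (roughly k! from the binomial, 2^(k/2) from the
# exponent) are small relative to the main terms k*log2(n) and k^2/2.
n = 10**6
k = round(2 * math.log2(n))                       # k = 40 here
exact = math.log2(math.comb(n, k)) - math.comb(k, 2)
rough = k * math.log2(n) - k * k / 2
print(k, exact, rough, k * math.log2(n))
```

Here each term is accurate to well within 25%, yet `exact` and `rough` still differ by more than a hundred in base-2 logarithm because the two huge main terms nearly cancel; this is exactly why the estimate is only "very rough" for $f(k)$ itself.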