I’m looking at a discussion of the following theorem and have a few questions about it.
Theorem $1.5.7$ ($l = 3$). Given $k, n \in \mathbb{N}$ and $p \in [0,1]$, if $$ {n \choose 3}p^3 + {n \choose k}(1-p)^{k \choose 2} < 1$$ then $R(3, k)>n$.
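(As a sanity check of my own, not from the slides, the theorem can be tested numerically: for a given $k$, search for the largest $n$ such that some $p$ makes the left-hand side less than $1$. The helper names `lhs` and `largest_n` are mine.)

```python
from math import comb

def lhs(n, k, p):
    """Left-hand side of the theorem: C(n,3) p^3 + C(n,k) (1-p)^C(k,2)."""
    return comb(n, 3) * p**3 + comb(n, k) * (1 - p) ** comb(k, 2)

def largest_n(k, p_grid=None):
    """Largest n for which some p on the grid makes lhs < 1, certifying R(3,k) > n."""
    if p_grid is None:
        p_grid = [i / 1000 for i in range(1, 500)]  # p ranges over (0, 0.5)
    n = k  # at n = k the second term is (1-p)^C(k,2) < 1, so lhs < 1 for small p
    while any(lhs(n + 1, k, p) < 1 for p in p_grid):
        n += 1
    return n

print(largest_n(10))  # prints 12, so the theorem gives R(3,10) > 12
```

The bound is of course far below the true value of $R(3,10)$; the point of the later estimates is precisely to squeeze more out of this inequality asymptotically.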
Better estimates:
$\cdot {n \choose 3}p^{3} \approx \frac{(n p)^{3}}{6} $
$\cdot {n \choose k}\approx 2^{H\left(\frac{k}{n}\right) n}$
$\cdot(1-p)^{k \choose 2} \approx e^{\frac{-p k^{2}}{2}}$ $\Rightarrow R(3, k)>c k$ for $c \approx 1.298$
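(To get a feel for how tight these three approximations are, I ran a small check of my own; the sample values $n = 1000$, $k = 100$, $p = 0.05$ and the helper `H` are my choices, not the slides'.)

```python
from math import comb, log2, log

def H(x):
    """Binary entropy H(x) = x log2(1/x) + (1-x) log2(1/(1-x))."""
    return -x * log2(x) - (1 - x) * log2(1 - x)

n, k, p = 1000, 100, 0.05

# C(n,3) p^3 vs (np)^3 / 6: ratio close to 1 already at this n
print(comb(n, 3) * p**3 / ((n * p) ** 3 / 6))

# C(n,k) vs 2^{H(k/n) n}: compare exponents in base 2
print(log2(comb(n, k)) / (H(k / n) * n))

# (1-p)^{C(k,2)} vs e^{-p k^2 / 2}: compare exponents in base e
print(comb(k, 2) * log(1 - p) / (-p * k**2 / 2))
```

All three ratios come out within a few percent of $1$, which is all the argument needs: the approximations are accurate to first order in the exponent.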
Can We Go Further? What happens for larger $n$?
$ \cdot {n \choose k} \geq\left(\frac{n}{k}\right)^{k}=e^{k \ln \frac{n}{k}}$
$\cdot(1-p)^{k \choose 2} > e^{-2 p {k \choose 2}}>e^{-p k^{2}}$
$\cdot$ Need $p=\Omega\left(k^{-1} \ln \frac{n}{k}\right)$, otherwise ${n \choose k}(1-p)^{k \choose 2}$ grows exponentially.
$\cdot$ But then ${n \choose 3} p^{3}=\Theta\left((n p)^{3}\right)=\Theta\left(\left(\frac{n}{k} \ln \frac{n}{k}\right)^{3}\right)$
$\quad$ $\cdot$ Bigger than $1$ if $n>C k$ for some constant $C$.
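(A quick numerical check of my own, taking $p = k^{-1}\ln\frac{n}{k}$ exactly: the first term does cross $1$ once $n$ is a moderate multiple of $k$, as the last dot point claims.)

```python
from math import comb, log

def first_term(n, k):
    """C(n,3) p^3 at the threshold choice p = ln(n/k) / k."""
    p = log(n / k) / k
    return comb(n, 3) * p**3

k = 100
print(first_term(200, k))  # < 1 at n = 2k
print(first_term(300, k))  # > 1 already at n = 3k
```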
- In the “Better estimates” section, what is $H$ in $\cdot {n \choose k}\approx 2^{H\left(\frac{k}{n}\right) n}$?
- In the “larger $n$” section, how do we go from $p$ being lower bounded by $k^{-1} \ln \frac{n}{k}$ to $\Theta\left((n p)^{3}\right)=\Theta\left(\left(\frac{n}{k} \ln \frac{n}{k}\right)^{3}\right)$? Surely $(np)^3$ would be lower bounded, but what about the upper bound part of $\Theta$?
- A bit of elaboration on the statement “Bigger than $1$…” at the end would be appreciated.
- On a side note, I sometimes feel uncertain reading expressions in asymptotic notation, as the authors don’t always specify which variables tend to infinity. For example, for $p=\Omega\left(k^{-1} \ln \frac{n}{k}\right)$, should I interpret it as implicitly saying that both $n$ and $k$ tend to infinity?
- A bit further on, in finding a new bound for $R(3,k)$, we have to maximise $n- {n \choose 3} p^3 - {n \choose k}(1-p)^{k \choose 2}$. The slides read:
$\cdot$ Take $p=\Theta\left(k^{-1} \ln \frac{n}{k}\right)$ $\Rightarrow$ Want to maximise $n - \Theta\left(\left(\frac{n}{k} \ln \frac{n}{k}\right) ^3 \right)$.
$\cdot$ At maximum: $\left(\frac{n}{k} \ln \frac{n}{k}\right)^{3}=\Theta(n)$ $\Rightarrow n=\Theta\left(\left(\frac{k}{\ln \frac{n}{k}}\right)^{\frac{3}{2}}\right)=\Theta\left(\left(\frac{k}{\ln k}\right)^{\frac{3}{2}}\right)$
How do we know that $p=\Theta\left(k^{-1} \ln \frac{n}{k}\right)$ works? From the “larger $n$” section, we know that $p$ must be lower bounded by $k^{-1} \ln \frac{n}{k}$, at which point the third term of the expression we need to maximise is exponentially small. But why the upper bound here? Also, the last dot point is somewhat lost on me.
- I also need to find an asymptotic lower bound for the general Ramsey number $R(l,k)$, where $l$ is fixed and $k \to \infty$. I’m thinking of using the idea in question $5$, but we’d use the expression $$n - {n \choose l}\left(\frac{1}{2}\right)^{l \choose 2} - {n \choose k}\left(\frac{1}{2}\right)^{k \choose 2}$$ to obtain the number of vertices in our Ramsey graph (the question doesn’t mention $p$, so I’m not sure if I should also optimize over it). I assume $n$ now only depends on $k$, so we’d need to maximise $n - {n \choose k}\left(\frac{1}{2}\right)^{k \choose 2}$. Would this be a valid approach?
First, you have some details involving $\binom k2$ wrong. In the original formula, you should have $$\binom n3 p^3 + \binom nk (1-p)^{\binom k2} < 1$$ with $\binom k2$ in the exponent. In the line $(1-p)^{\left(\frac{k}{2}\right)}>e^{-2 p\left(\frac{k}{2}\right)}>e^{-p k^{2}}$, every instance of $\left(\frac k2\right)$ should actually be $\binom k2$.
Now, on to your questions.
In $\binom nk \approx 2^{H(k/n) n}$, $H$ is the entropy function $H(p) = p \log_2 \frac1p + (1-p) \log_2 \frac1{1-p}$. (This is the Shannon entropy, in bits, of a random variable which takes on two values with probability $p$ and $1-p$.) For this application, I think all you need to know is that it's some function of $\frac kn$ which is always between $0$ and $1$.
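If it helps, here's a quick numerical illustration (my own snippet, with `H` implementing the entropy function just defined): $H$ indeed stays in $[0,1]$, and $2^{H(k/n)n}$ matches $\binom nk$ to first order in the exponent.

```python
from math import comb, log2

def H(x):
    """Shannon entropy, in bits, of a two-valued distribution {x, 1-x}."""
    return -x * log2(x) - (1 - x) * log2(1 - x)

# H is between 0 and 1, maximised at x = 1/2:
print(H(0.1), H(0.5))

# and 2^{H(k/n) n} agrees with C(n,k) in the exponent:
n, k = 1000, 100
print(log2(comb(n, k)), H(k / n) * n)  # the two exponents differ by ~1%
```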
You're right: from $p = \Omega(k^{-1} \ln \frac nk)$, we can only deduce that $\binom n3 p^3 = \Omega((\frac nk \ln \frac nk)^3)$. This still implies our final conclusion: that if $\frac nk$ is large enough, $\binom n3 p^3 > 1$.
Our goal is to choose $n$ and $p$ in terms of $k$ such that $\binom n3 p^3 + \binom nk (1-p)^{\binom k2} < 1$. From the previous part, we concluded that if $p = \Omega(k^{-1} \ln \frac nk)$, then there's some constant $C$ such that if $n > Ck$, we have $\binom n3 p^3 > 1$, making the whole thing bigger than $1$, and we don't get what we want.
In general, yes, the $\Omega$ bound should be for both $n$ and $k$ tending to infinity. In this particular case, you can read $\Omega(k^{-1} \ln \frac nk)$ as "at least $c k^{-1} \ln \frac nk$ for some constant $c>0$", and that's often what's going on if people use $\Omega$ and $O$ notation with multiple variables that can go to infinity. (Actually, here, I think we could just have taken $c=1$ and saved ourselves the trouble of worrying about asymptotics.)
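For what it's worth, here's a log-space check (my own sketch, working with logarithms to avoid overflow; the constants $0.5$ and $3$ are arbitrary choices on either side of the threshold, not from the slides) of how the second term flips from enormous to negligible as the constant in $p = c\,k^{-1}\ln\frac nk$ grows:

```python
from math import lgamma, log, log1p

def log_second_term(n, k, c):
    """Natural log of C(n,k) * (1-p)^{C(k,2)} with p = c * ln(n/k) / k."""
    p = c * log(n / k) / k
    log_binom = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return log_binom + (k * (k - 1) // 2) * log1p(-p)

n, k = 10_000, 100
print(log_second_term(n, k, 0.5))  # positive: the term is astronomically large
print(log_second_term(n, k, 3.0))  # negative: the term is exponentially small
```

So at this scale a small constant in front of $k^{-1}\ln\frac nk$ leaves the term huge, while a safely large one kills it, which is the $\Omega$-threshold behaviour described above.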