Questions about Hamiltonicity of random graphs.


I’m reading a proof of the Hamiltonicity of a random graph, and there are a few details I’m not clear about. Here’s the setup and argument:

Let $G$ be a graph on $[n]$, $\alpha \in (0,1)$, and let $R = \{e_1,e_2,\dots,e_{|R|}\}$ be a collection of edges chosen uniformly at random from $\binom{[n]}{2}$. We assume:
$\ \ \bullet$ the longest path in $G$ has length at least $l(n)$
$\ \ \bullet$ $(|R|\cdot\alpha^2) \to \infty$
$\ \ \bullet$ $|R| \geq \frac{100(n - l(n))}{\alpha^2}$
$\ \ \bullet$ $G$ is connected.

For $t \in [|R|]$, let the random variable $Y_t = 1$ if $e_t = \{x,y\}$ and there exists a longest path in $G_t := G_{t-1} \cup \{e_t\}$ with endpoints $x$ and $y$ (and $Y_t = 0$ otherwise). Some further analysis (which is omitted here) shows that $\mathbb{P}(Y_t = 1) \geq \alpha^2$.

Observation: if $\sum_{t=1}^{|R|} Y_t \geq n - l(n)$, then $G \cup R$ is Hamiltonian.

Also, since $\sum_{t=1}^{|R|} Y_t$ is dominated by a $Bin(|R|,\alpha^2)$ random variable, we have:

\begin{align} &\mathbb{P}(\text{$G \cup R$ not Hamiltonian}) \\ \leq \ &\mathbb{P}(Bin(|R|,\alpha^2) \leq n - l(n)) \\ \leq \ &\mathbb{P}(Bin(|R|,\alpha^2) \leq \frac{|R| \cdot \alpha^2}{100}) = o(1) \end{align}

My questions are as follows:

  1. I want to make sure I understand the statement: $$\sum_{t=1}^{|R|} Y_t \geq n - l(n) \Rightarrow G \cup R \text{ is Hamiltonian}$$ If we look at the sequence of graphs $G := G_0, G_1, G_2,\dots, G_{|R|}$, i.e. we throw the edges of $R$ onto $G$ one by one, then the LHS condition implies that we have closed a longest path in those graphs at least $n - l(n)$ times, with each such step lengthening the longest path by $1$?

  2. Why do we have: $\sum_{t=1}^{|R|} Y_t$ is dominated by a $Bin(|R|,\alpha^2)$ random variable?

  3. Lastly, how do we get the $o(1)$ at the end?

Best answer:

Yes, you are right about what the proof means by $$ \sum_{t=1}^{|R|} Y_t \geq n - l(n) \implies G \cup R \text{ is Hamiltonian}. $$ Each time $Y_t = 1$, we either increase the length of the longest path by $1$, or turn a path of length $n-1$ into a Hamiltonian cycle. (Or, if $Y_t = 1$ and $G_{t-1}$ is already Hamiltonian, we do nothing.) Taking at least $n - l(n)$ such steps must result in a Hamiltonian cycle.
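This step-counting can be checked concretely on a tiny example. The sketch below (hypothetical helper functions, brute force over permutations, so only feasible for very small $n$) verifies that a path of length $n-1$ is one closing edge away from a Hamiltonian cycle:

```python
import itertools

def longest_path_len(n, edges):
    """Brute-force longest path length (number of edges) on vertices 0..n-1."""
    E = {frozenset(e) for e in edges}
    best = 0
    for k in range(2, n + 1):
        for perm in itertools.permutations(range(n), k):
            if all(frozenset(p) in E for p in zip(perm, perm[1:])):
                best = max(best, k - 1)
    return best

def is_hamiltonian(n, edges):
    """Brute-force check for a Hamiltonian cycle."""
    E = {frozenset(e) for e in edges}
    for perm in itertools.permutations(range(n)):
        if (all(frozenset(p) in E for p in zip(perm, perm[1:]))
                and frozenset((perm[-1], perm[0])) in E):
            return True
    return False

n = 5
path = [(i, i + 1) for i in range(n - 1)]       # path 0-1-2-3-4, length n-1 = 4
assert longest_path_len(n, path) == n - 1
assert not is_hamiltonian(n, path)
assert is_hamiltonian(n, path + [(0, n - 1)])   # closing edge gives a Hamiltonian cycle
```

Starting from a longest path of length $l(n)$, at most $n - 1 - l(n)$ lengthening steps plus one closing step are needed, which is exactly the $n - l(n)$ successes in the observation.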


The sum $\sum_{t=1}^{|R|} Y_t$ is "dominated by $\text{Bin}(|R|,\alpha^2)$" in the following sense. (I would actually say "dominates" rather than "is dominated by".)

It's not just true that $\mathbb P(Y_t = 1) \ge \alpha^2$. It's also true that for any event $A$ defined in terms of $Y_1, \dots, Y_{t-1}$, $\mathbb P(Y_t = 1 \mid A) \ge \alpha^2$. This conditional probability is also key to the proof, and I hope that it is explained in the step you omitted. (Essentially, as long as $G$ is an expander graph with certain parameters, then $G_1, \dots, G_t$ must have this property, regardless of $R$.)

So we can define a sequence $(Y_t')$ with the following properties:

  1. Independently of $Y_1', \dots, Y_{t-1}'$, $\mathbb P(Y_t' = 1) = \alpha^2$.
  2. If $Y_t = 0$, then $Y_t' = 0$ as well.

Essentially, whenever $Y_t = 1$, set $Y_t' = 1$ with the correct probability (depending on all our previous history) so that the overall $\mathbb P(Y_t' = 1)$ in this situation is $\alpha^2$.

Now we have $\sum_{t=1}^{|R|} Y_t \ge \sum_{t=1}^{|R|} Y_t'$ by property 2 above, and this second sum is $\text{Bin}(|R|,\alpha^2)$ exactly. Therefore $$ \mathbb P\left(\sum_{t=1}^{|R|} Y_t < n - l(n)\right) \le \mathbb P\left(\sum_{t=1}^{|R|} Y_t' < n - l(n)\right) $$ and for this second sum, we can use binomial probability techniques.
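The coupling can be sketched in a few lines of Python. This is illustrative only: here the per-step success probabilities $q_t \ge p$ are drawn arbitrarily rather than coming from the actual graph process, and $p$ stands in for $\alpha^2$. Using a single fresh uniform per step makes $Y_t' \le Y_t$ pointwise while the $Y_t'$ are i.i.d. Bernoulli($p$):

```python
import random

def coupled_step(q, p, rng):
    """One coupling step: draw a single uniform and threshold it twice.
    Y = 1 with probability q >= p; Y' = 1 with probability exactly p.
    Since p <= q, the event {Y' = 1} is contained in {Y = 1}, so Y' <= Y."""
    u = rng.random()
    return int(u < q), int(u < p)

rng = random.Random(0)
p = 0.3                                                  # stands in for alpha^2
qs = [p + (1 - p) * rng.random() for _ in range(1000)]   # arbitrary q_t >= p
pairs = [coupled_step(q, p, rng) for q in qs]

assert all(yp <= y for y, yp in pairs)                   # pointwise domination
assert sum(y for y, _ in pairs) >= sum(yp for _, yp in pairs)
```

Because each $u_t$ is fresh and independent of the history, $\sum_t Y_t'$ is exactly $\text{Bin}(|R|, p)$, while $\sum_t Y_t$ dominates it term by term.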


Let $Y' = \sum_{t=1}^{|R|} Y_t'$; by the above, we have $Y' \sim \text{Bin}(|R|,\alpha^2)$. We want to bound $\mathbb P(Y' < n - l(n))$.

Well, we have $\mu := \mathbb E(Y') = \alpha^2 |R| \ge 100 (n - l(n))$ by one of the bulleted assumptions made at the beginning, and so it's enough to bound $\mathbb P(Y' < 0.01 \mu)$. By a Chernoff bound, $$ \mathbb P(Y' < 0.01 \mu) \le \left(\frac{e^{-0.99}}{0.01^{0.01}}\right)^\mu \le 0.39^\mu. $$ Since $\mu = \alpha^2 |R| \to \infty$ (another one of the bulleted assumptions), $0.39^\mu \to 0$, and so $\mathbb P(Y' < n - l(n)) = o(1)$.
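The constant $0.39$ can be verified numerically with the standard library alone. For $\delta = 0.99$, the base of the multiplicative Chernoff bound $\left(e^{-\delta}/(1-\delta)^{1-\delta}\right)^\mu$ evaluates to roughly $0.389$, and an exact binomial tail computation (at modest, arbitrarily chosen parameters with $\mu = 100$) confirms the bound is not violated:

```python
import math

# Base of the Chernoff bound e^{-delta} / (1-delta)^{1-delta} with delta = 0.99.
base = math.exp(-0.99) / (1 - 0.99) ** (1 - 0.99)
assert 0.38 < base < 0.39          # approximately 0.389

def exact_lower_tail(n, p, k):
    """P(Bin(n, p) <= k), computed exactly from the binomial pmf."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Sanity check: with n = 2000, p = 0.05 we have mu = 100, and the exact
# probability P(Bin <= 0.01 * mu) = P(Bin <= 1) sits below base**mu.
n, p = 2000, 0.05
assert exact_lower_tail(n, p, 1) <= base ** 100
```

Since the base is strictly below $1$, raising it to the power $\mu = \alpha^2 |R| \to \infty$ drives the bound to $0$, which is the $o(1)$ in the conclusion.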