Bounding the degree of a very sparse random graph


I am confused about how to manipulate big-O notation. Here is a problem from Section 2.4 (Exercise 2.4.3) of High-Dimensional Probability by Roman Vershynin:

Consider a random graph $G \sim G(n,p)$ with expected degree $d = O(1)$. Show that, with high probability, all vertices of $G$ have degree $$O\left(\frac{\log n}{\log \log n}\right).$$

I want to use the Chernoff bound below to bound the degrees of the vertices:

$$\Pr\left[\,|X-\mu| \geq \delta \mu\,\right] \leq 2e^{-c\mu\delta^2},$$

where $c$ is an absolute constant, $\delta \in [0,1)$, $\mu = \mathbb{E}[X]$, and $X$ is a binomial random variable.

How should I do that?


2 Answers

Best answer:

The degree distribution of a node is $\text{Binomial}(n-1, p)$. Here, the mean is constant, so it is a very skewed binomial distribution (approximately Poisson), and the usual Chernoff bound does not do well in such cases. You can try it, but it will not work.
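To see how skewed this is, here is a quick numerical sketch (the values $d = 5$ and $n = 10^4$ are illustrative choices, not from the exercise) comparing the exact $\text{Binomial}(n-1, p)$ upper tail with the $\text{Poisson}(d)$ tail:

```python
import math

def binom_tail(n, p, k):
    """Exact Pr[X >= k] for X ~ Binomial(n, p), computed via the complement."""
    return 1.0 - sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k))

def poisson_tail(lam, k):
    """Pr[Y >= k] for Y ~ Poisson(lam), computed via the complement."""
    return 1.0 - sum(math.exp(-lam) * lam**j / math.factorial(j) for j in range(k))

d, n = 5, 10_000   # illustrative: constant expected degree d, large n
p = d / n
for k in (10, 15, 20):
    print(k, binom_tail(n - 1, p, k), poisson_tail(d, k))
```

The two tails agree to within a few percent, confirming the Poisson approximation. Note also that the two-sided bound quoted in the question cannot even be applied here: a degree threshold $k \gg d$ corresponds to $\delta = k/\mu - 1 > 1$, outside the stated range $\delta \in [0,1)$.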

Instead, we can bound the probability that a node has degree at least $k$ by $\binom{n-1}{k}p^k$: the union bound, over all sets of $k$ nodes, that all nodes in the set are adjacent to the given node. So if $X_{\ge k}$ denotes the number of nodes with degree at least $k$, we have $$ \mathbb E[X_{\ge k}] \le n \binom{n-1}{k}p^k \le n \left(\frac{(n-1)e}{k}\right)^k p^k = n \left(\frac{de}{k}\right)^k = \exp\Big(\log n - k \log k + O(k)\Big). $$ It's pretty common to try to find an asymptotic inverse of something like $f(x) = x \log x$: it is $f^{-1}(x) = (1+o(1)) \frac{x}{\log x}$. Here, to get $\mathbb E[X_{\ge k}] = o(1)$, we want $k \log k$ to outgrow $\log n$, so we should be looking at a value of $k$ like $\frac{\log n}{\log \log n}$.
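This union bound can be checked numerically (a sketch; $d = 1$ is an illustrative choice, and $k$ is rounded to an integer): with $k = \frac{\log n}{\log\log n}$ the logarithm of the bound on $\mathbb E[X_{\ge k}]$ stays positive (so the bound exceeds $1$ and is useless), while with $k = \frac{2\log n}{\log\log n}$ it eventually goes negative:

```python
import math

def log_expected_count(n, d, k):
    """log of n * C(n-1, k) * p^k with p = d/n: the union bound on E[X_{>=k}]."""
    p = d / n
    return math.log(n) + math.log(math.comb(n - 1, k)) + k * math.log(p)

d = 1.0  # illustrative constant expected degree
for n in (10**4, 10**6, 10**8, 10**10):
    k1 = max(2, round(math.log(n) / math.log(math.log(n))))      # k = log n / log log n
    k2 = max(2, round(2 * math.log(n) / math.log(math.log(n))))  # k = 2 log n / log log n
    print(n, k1, round(log_expected_count(n, d, k1), 2),
          k2, round(log_expected_count(n, d, k2), 2))
```

Python's exact big-integer `math.comb` keeps this computation honest even for $n = 10^{10}$.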

But if we use $k = \frac{\log n}{\log \log n}$, then we get $k \log k = \log n \cdot (1 - \frac{\log \log \log n}{\log\log n}) < \log n$, which isn't good enough.
(Plugging this into the expectation gives $\log n - k\log k + O(k) = \frac{\log\log\log n}{\log\log n}\log n + O(k)$, which is still positive, so the bound on $\mathbb E[X_{\ge k}]$ does not tend to $0$.)

Instead, set $k = \frac{2 \log n}{\log \log n}$ (any constant bigger than $1$ would do, actually). Now we get:

  • $k \log k = k(\log 2 + \log\log n - \log\log \log n) = (1+o(1)) k \log \log n = (1+o(1)) 2 \log n$.
  • Since $k = o(\log n)$, we have $k \log k + O(k) = (1+o(1)) 2\log n$ as well.
  • Finally, $\log n - k \log k + O(k)$ becomes $-(1+o(1))\log n$, which goes to $-\infty$ with $n$, so $\exp(\log n - k \log k + O(k)) = o(1)$.
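The computation in the bullets can be sanity-checked numerically. Here the $O(k)$ term is instantiated as $k(1 + \log d)$, the exact value coming from $n(de/k)^k$, with the illustrative choice $d = 1$; note the $o(1)$ terms decay very slowly, so the exponent turns negative only for fairly large $n$:

```python
import math

def exponent(n, d, k):
    """log of the union bound n * (d*e/k)^k, i.e. log n + k(1 + log d) - k log k."""
    return math.log(n) + k * (1 + math.log(d)) - k * math.log(k)

d = 1.0
for n in (10**6, 10**12, 10**24, 10**48):
    k = 2 * math.log(n) / math.log(math.log(n))  # k = 2 log n / log log n, kept real-valued
    ratio = k * math.log(k) / (2 * math.log(n))  # the (1 + o(1)) factor from the first bullet
    print(f"n=1e{round(math.log10(n))}: k={k:.1f}, "
          f"k log k / (2 log n)={ratio:.3f}, exponent={exponent(n, d, k):.2f}")
```

The exponent decreases monotonically toward $-\infty$ along this sequence, matching the last bullet, while the ratio drifts toward $1$ only glacially (the $o(1)$ term is of order $\frac{\log\log\log n}{\log\log n}$).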
Second answer:

Just use Theorem 2.3.1, the usual Chernoff inequality for the upper tail; it works here. It yields almost the same condition as the first answer, namely $t \log t + O(t) \geq \log n - \epsilon$, and then it remains to show that $t = O(\log n / \log\log n)$ suffices.
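For reference, Theorem 2.3.1 in Vershynin's book gives, for a sum $X$ of independent Bernoulli variables with mean $\mu$ and any $t \ge \mu$, the bound $\Pr[X \ge t] \le e^{-\mu}(e\mu/t)^t$ (quoted from memory; treat the exact form as an assumption). A quick numerical check that this bound dominates the exact binomial tail, with illustrative values $n = 10^4$, $d = 2$:

```python
import math

def binom_tail(n, p, t):
    """Exact Pr[X >= t] for X ~ Binomial(n, p), computed via the complement."""
    return 1.0 - sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(t))

def chernoff_upper(mu, t):
    """Upper-tail bound e^{-mu} * (e*mu/t)^t, valid for t >= mu."""
    return math.exp(-mu) * (math.e * mu / t) ** t

n, d = 10_000, 2.0            # illustrative values
p = d / n
mu = (n - 1) * p              # mean of Binomial(n-1, p)
for t in (5, 10, 15):
    print(t, binom_tail(n - 1, p, t), chernoff_upper(mu, t))
```

Taking logarithms of the bound gives $-\mu + t\log\mu + t - t\log t = -t\log t + O(t)$ for constant $\mu$, which is exactly where the condition $t\log t + O(t) \ge \log n$ comes from after a union bound over the $n$ vertices.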