Let $ G_{n,p} $ be a random graph on $n$ vertices in which each of the $ \binom{n}{2} $ possible edges appears independently with probability $p$. Now let $ p = \frac{d}{n} $, where $d$ is a positive constant (so that $p$ is well defined). Prove that with high probability, $ G_{n,p} $ contains a vertex of degree at least $ \left(\ln n\right)^{\frac{1}{2}} $.
By "with high probability", I mean that the probability that $ G_{n,p} $ contains such a vertex converges to $1$ as $ n \to \infty $.
I don't have an idea of how to start right now. Any help would be appreciated.
Thanks in advance
To prove existence, we'll use the second moment method. The key points are that the degrees of different vertices are almost independent, and that we need tight bounds on the tail of the binomial distribution. Throughout, set $K = \sqrt{\ln n}$, let $B_i$ be the degree of vertex $i$ (so $B_i \sim \mathrm{Bin}(n-1,p)$), let $X_i = \mathbf{1}[B_i \ge K]$, and let $X = \sum_{i=1}^n X_i$ count the vertices of degree at least $K$. For $i \neq j$, let $e_{ij}$ be the indicator of the edge $\{i,j\}$ and $B_i^j$ the number of neighbours of $i$ other than $j$, so that $B_i = B_i^j + e_{ij}$. The proof can be written with the following steps:
Using the relative entropy (or Kullback–Leibler divergence) between the Bernoulli($a$) and Bernoulli($p$) distributions: $$D(a\parallel p) = a \log\frac{a}{p}+(1-a)\log\frac{1-a}{1-p},$$ with $p=\frac{d}{n}$ and $a=\frac{K}{n}$ we get: $$D\!\left(\frac{K}{n} \,\Big\|\, \frac{d}{n}\right) = \frac{K}{n}\log\frac{K}{d}+\left(1-\frac{K}{n}\right)\log\left(\frac{1-\frac{K}{n}}{1-\frac{d}{n}}\right).$$
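As a quick numerical sanity check, the divergence above is easy to evaluate directly (a minimal Python sketch; the function name `kl_bernoulli` is my own):

```python
import math

def kl_bernoulli(a: float, p: float) -> float:
    """Relative entropy D(a || p) between Bernoulli(a) and Bernoulli(p), in nats."""
    # Convention: a term vanishes when its coefficient is 0 (0 * log 0 = 0).
    div = 0.0
    if a > 0:
        div += a * math.log(a / p)
    if a < 1:
        div += (1 - a) * math.log((1 - a) / (1 - p))
    return div

# Plug in p = d/n and a = K/n with K = sqrt(log n), as in the proof.
n, d = 1000, 2
K = math.sqrt(math.log(n))
print(kl_bernoulli(K / n, d / n))
```

The divergence is $0$ exactly when the two distributions coincide and strictly positive otherwise.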
Expressing the Chernoff bound through this divergence gives the following lower and upper bounds on the upper tail of the binomial distribution (here $K \gg np$; replacing $n-1$ by $n$ in the exponent only costs a $1+o(1)$ factor): $$\frac{1}{\sqrt{2 n}} \exp\left(-n D\!\left(\frac{K}{n} \,\Big\|\, \frac{d}{n}\right)\right) \leq P[B_i\ge K] = E[X_i] \leq \exp\left(-n D\!\left(\frac{K}{n} \,\Big\|\, \frac{d}{n}\right)\right).$$
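To see that both entropy bounds are sound, one can compare them against the exact binomial tail at moderate parameters (a stdlib-only sketch; `binom_tail` and the parameter choices are mine):

```python
import math

def kl_bernoulli(a: float, p: float) -> float:
    """Relative entropy D(a || p) between Bernoulli(a) and Bernoulli(p)."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def binom_tail(n: int, p: float, K: int) -> float:
    """Exact P[Bin(n, p) >= K], summing the pmf term by term."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(K, n + 1))

# Moderate example with K well above the mean np = d.
n, d, K = 500, 2, 10
tail = binom_tail(n, d / n, K)
upper = math.exp(-n * kl_bernoulli(K / n, d / n))
lower = upper / math.sqrt(2 * n)
print(lower <= tail <= upper)  # the exact tail sits between the two bounds
```

Here the tail is about $4\cdot 10^{-5}$, comfortably between the two entropy bounds.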
Assuming $ n \gg K \gg d $ and using some calculus we get the following approximations and limits (with $K=\sqrt{\log n}$ and $\mu_x := E[X] = nE[X_i]$): $$\begin{align} D\!\left(\frac{\sqrt{\log n}}{n} \,\Big\|\, \frac{d}{n}\right) &= \frac{\sqrt{\log n}}{n} \left(\log\frac{\sqrt{\log n}}{d} - 1\right) + \frac{d}{n} + \mathcal{O}\!\left(\frac{1}{n^2}\right) \\ \lim_{n \to \infty} D\!\left(\frac{K}{n} \,\Big\|\, p\right) &= 0 \\ \lim_{n \to \infty} n D\!\left(\frac{K}{n} \,\Big\|\, p\right) &= \lim_{n \to \infty} \left(K \log K + \mathcal{O}(K)\right) = \infty \\ \lim_{n \to \infty} \exp\left(-n D\!\left(\frac{K}{n} \,\Big\|\, p\right)\right) &= 0 \\ \lim_{n \to \infty} \left(\tfrac{1}{2}\log n - K \log K + \mathcal{O}(K)\right) &= \infty \\ \lim_{n \to \infty} \frac{n}{\sqrt{2 n}} \exp\left(-n D\!\left(\frac{K}{n} \,\Big\|\, p\right)\right) &= \lim_{n \to \infty} \exp\left(\log\sqrt{\frac{n}{2}} -n D\!\left(\frac{K}{n} \,\Big\|\, p\right)\right) = \infty \end{align}$$ and with the bounds above we get: $$\begin{align} \lim_{n \to \infty} E[X_i] &= \lim_{n \to \infty} P[B_i\ge K] = 0 \\ \lim_{n \to \infty} \mu_x &= \lim_{n \to \infty} nE[X_i] = \infty \end{align}$$
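These limits kick in very slowly (the exponents involve $\sqrt{\log n}$), but both claims, $E[X_i]\to 0$ and $nE[X_i]\to\infty$, can be observed numerically by computing the exact tail $P[\mathrm{Bin}(n-1, d/n) \ge \lceil\sqrt{\log n}\rceil]$ for growing $n$ (a sketch; computing the tail via the complement keeps the sum to a handful of terms):

```python
import math

def binom_tail(n: int, p: float, K: int) -> float:
    """Exact P[Bin(n, p) >= K] via the complement; K is small here."""
    return 1.0 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(K))

d = 2
tails, n_tails = [], []
for n in (100, 10**5, 10**8):
    K = math.ceil(math.sqrt(math.log(n)))  # threshold degree, rounded up
    t = binom_tail(n - 1, d / n, K)
    tails.append(t)        # E[X_i]: shrinks as n grows
    n_tails.append(n * t)  # mu_x = n * E[X_i]: blows up
print(tails)
print(n_tails)
```

Even across six orders of magnitude in $n$, the per-vertex probability only falls from roughly $0.3$ to $0.05$, while $\mu_x$ already exceeds $10^6$.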
Using the law of total probability, and the fact that $B_i^j$, $B_j^i$ and $e_{ij}$ are mutually independent random variables, approximate: $$\begin{align} E[X_i X_j] &= P[B_i\ge K , B_j\ge K] \\ &= P[B_i\ge K \land B_j\ge K \mid e_{ij}=1]\color{brown}{P[e_{ij}=1]} + P[B_i\ge K \land B_j\ge K \mid e_{ij}=0]\color{brown}{P[e_{ij}=0]} \\ &= P[B_i^j\ge K-1 \land B_j^i\ge K-1]\color{brown}{P[e_{ij}=1]} + P[B_i^j\ge K \land B_j^i\ge K]\color{brown}{P[e_{ij}=0]} \\ &= P[B_i^j\ge K-1]P[B_j^i\ge K-1]\color{brown}{p} + P[B_i^j\ge K]P[B_j^i\ge K](\color{brown}{1-p}). \end{align}$$ For the second term, $P[B_i^j\ge K] = P[B_i\ge K](1+o(1))$: dropping one potential edge changes the tail probability only by a $1+o(1)$ factor. For the first term, the ratio of consecutive binomial tail probabilities gives $P[B_i^j\ge K-1] = \mathcal{O}(K)\,P[B_i\ge K]$, so the whole term is $\mathcal{O}\!\left(\frac{K^2}{n}\right) P[B_i\ge K]^2 = o(1)\, P[B_i\ge K]^2$. Together: $$E[X_i X_j] = P[B_i\ge K]^2 (1+o(1)) = E[X_i]^2 (1+o(1)).$$
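The conditioning identity in the display is exact (only the final $1+o(1)$ step is an approximation), so it can be checked numerically: compute $E[X_iX_j]$ from the edge-conditioned formula and compare with $E[X_i]^2$ (a sketch; the parameter choices are mine):

```python
import math

def binom_tail(n: int, p: float, K: int) -> float:
    """Exact P[Bin(n, p) >= K] via the complement; K is small here."""
    return 1.0 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(K))

n, d = 10_000, 2
p = d / n
K = math.ceil(math.sqrt(math.log(n)))

e_xi = binom_tail(n - 1, p, K)  # E[X_i] = P[B_i >= K], B_i ~ Bin(n-1, p)
# Condition on the edge e_ij: given e_ij, B_i^j and B_j^i are iid Bin(n-2, p).
e_xixj = p * binom_tail(n - 2, p, K - 1)**2 + (1 - p) * binom_tail(n - 2, p, K)**2
print(e_xixj / e_xi**2)  # close to 1: the two degrees are almost independent
```

Already at $n = 10^4$ the ratio is within a fraction of a percent of $1$.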
Approximate the second moment (using $X_i^2 = X_i$ for indicators): $$\begin{align} E[X^2] &= E\left[\Big(\sum_{i=1}^{n} X_{i}\Big)^2\right] \\ &= \sum_{i=1}^{n} E[X_{i}^2] + \sum_{i\neq j} E[X_{i}X_{j}] \\ &= nE[X_{i}] + n(n-1)E[X_{i}X_{j}] \\ &= \mu_x + n^2 E[X_i]^2 (1+o(1)) \\ &= \mu_x + \mu_x^2 (1+o(1)). \end{align}$$
Using the second moment method (the Paley–Zygmund / Cauchy–Schwarz bound): $$P[X \ge 1] \geq \frac{(E[X])^2}{E[X^2]} = \frac{\mu_x^2}{\mu_x + \mu_x^2 (1+o(1))} \xrightarrow[n\to\infty]{} 1,$$ since $\mu_x \to \infty$.
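Putting the pieces together numerically: with exact values of $\mu_x$ and of $E[X^2] = \mu_x + n(n-1)E[X_iX_j]$ (using the exact edge-conditioned formula for $E[X_iX_j]$), the second-moment lower bound on $P[X \ge 1]$ is already very close to $1$ at moderate $n$ (a sketch, same assumed helpers as before):

```python
import math

def binom_tail(n: int, p: float, K: int) -> float:
    """Exact P[Bin(n, p) >= K] via the complement; K is small here."""
    return 1.0 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(K))

n, d = 10_000, 2
p = d / n
K = math.ceil(math.sqrt(math.log(n)))

mu = n * binom_tail(n - 1, p, K)  # mu_x = E[X]
exixj = p * binom_tail(n - 2, p, K - 1)**2 + (1 - p) * binom_tail(n - 2, p, K)**2
second_moment = mu + n * (n - 1) * exixj  # E[X^2]
lower_bound = mu**2 / second_moment       # P[X >= 1] >= mu^2 / E[X^2]
print(lower_bound)
```

At $n = 10^4$ this already guarantees $P[X \ge 1] > 0.99$.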
With high probability, at least one vertex of degree at least $\sqrt{\ln n}$ exists.
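Finally, a direct Monte Carlo check of the statement itself: sample $G_{n,p}$ a few times and look for a vertex of degree at least $\sqrt{\ln n}$ (a plain-Python sketch with a fixed seed; even modest $n$ clears the threshold easily because $\sqrt{\ln n}$ grows so slowly):

```python
import math
import random

def max_degree_gnp(n: int, p: float, rng: random.Random) -> int:
    """Sample G(n, p) edge by edge and return the maximum degree."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return max(deg)

n, d = 500, 2
threshold = math.sqrt(math.log(n))  # ~2.49 for n = 500
rng = random.Random(0)
hits = sum(max_degree_gnp(n, d / n, rng) >= threshold for _ in range(10))
print(f"{hits}/10 samples contain a vertex of degree >= {threshold:.2f}")
```

For these parameters the per-sample failure probability is astronomically small (every degree would have to be at most $2$), so all samples should succeed.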