I want to express the expected size of the component containing a uniformly chosen node in the Erdős–Rényi random graph $G(n,p)$. I already read this thread, but I want to express it asymptotically.
Let $p = \dfrac{c}{n}$, and let $S$ be the random variable giving the size of the component containing the chosen node.
The results I want to show (as $n \rightarrow \infty$):
- For $c < 1$: $\Bbb E[S] = O(\log(n))$
- For $1 < c < \log(n)$: $\Bbb E[S] = q^2 n + (1-q)\log(n)$
- For $c > \log(n)$: $\Bbb E[S] = n$
where $q$ denotes the fraction of nodes in the giant component of the graph.
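To sanity-check these regimes numerically, here is a minimal Monte Carlo sketch in plain Python (the functions `gnp`, `component_size`, `mean_component_size` and the parameter choices are mine, purely for illustration): sample $G(n, c/n)$, pick a uniform node, and average the size of its component.

```python
import random
from collections import deque

def gnp(n, p, rng):
    """Sample adjacency lists of G(n, p): one independent coin flip per vertex pair."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def component_size(adj, v):
    """Size of the connected component containing v, via BFS."""
    seen = {v}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen)

def mean_component_size(n, c, trials, seed=0):
    """Monte Carlo estimate of E[S] in G(n, c/n) for a uniformly chosen node."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        adj = gnp(n, c / n, rng)
        total += component_size(adj, rng.randrange(n))
    return total / trials

n = 300
sub = mean_component_size(n, 0.5, 30)  # subcritical: stays small, does not grow with n
sup = mean_component_size(n, 2.0, 30)  # supercritical: of order q^2 * n
print(sub, sup)
```

For $c = 2$ the estimate should be within a modest factor of $q^2 n$, with $q$ the survival probability solving $q = 1 - e^{-cq}$.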
The results I already know:
If $np < 1$, then a graph in $G(n, p)$ will almost surely have no connected components of size larger than $O(\log(n))$.
If $np = 1$, then a graph in $G(n, p)$ will almost surely have a largest component whose size is of order $n^{2/3}$.
If $np \to c > 1$, where $c$ is a constant, then a graph in $G(n, p)$ will almost surely have a unique giant component containing a positive fraction of the vertices. No other component will contain more than $O(\log(n))$ vertices.
If $p<\tfrac{(1-\epsilon)\ln n}{n}$, then a graph in $G(n, p)$ will almost surely contain isolated vertices, and thus be disconnected.
If $p>\tfrac{(1+\epsilon) \ln n}{n}$, then a graph in $G(n, p)$ will almost surely be connected.
Thus $\tfrac{\ln n}{n}$ is a sharp threshold for the connectedness of $G(n, p)$.
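The sharpness of this threshold is easy to probe empirically. Below is a small sketch in plain Python (the harness `is_connected`/`connected_fraction` and the choices $n = 300$ with factors $0.5$ and $2$ are mine, for illustration only): it estimates the probability that $G(n,p)$ is connected just below and just above $p = \ln n / n$.

```python
import math
import random
from collections import deque

def is_connected(n, p, rng):
    """Sample G(n, p) and check connectivity by BFS from vertex 0."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == n

def connected_fraction(n, factor, trials, seed=1):
    """Fraction of connected samples at p = factor * ln(n) / n."""
    rng = random.Random(seed)
    p = factor * math.log(n) / n
    return sum(is_connected(n, p, rng) for _ in range(trials)) / trials

n = 300
below = connected_fraction(n, 0.5, 20)  # below the threshold: isolated vertices likely
above = connected_fraction(n, 2.0, 20)  # above the threshold: connected w.h.p.
print(below, above)
```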
(Quoted from Did's answer here.)
What I tried (the case $c<1$): set $k_n = a\log(n)$, with $a$ chosen so that $\Bbb P(|C_{\max}| \geq a\log(n)) \rightarrow 0$, where $|C_{\max}|$ is the size of the largest component. $$\begin{align} \Bbb E[S] &= \sum_{k=1}^n k\,\Bbb P(S=k)\\ & = \sum_{k=1}^{\lfloor k_n \rfloor} k\,\Bbb P(S=k) + \sum_{k=\lfloor k_n \rfloor+1}^{n} k\,\Bbb P(S=k) \\ & \leq \sum_{k=1}^{\lfloor k_n \rfloor} k\,\Bbb P(S=k) + n\sum_{k=\lfloor k_n \rfloor+1}^{n}\Bbb P(S=k)\\ & \leq \sum_{k=1}^{\lfloor k_n \rfloor} k\,\Bbb P(S=k) + n\,\Bbb P(|C_{\max}| \geq a\log(n)) \end{align}$$
I don't know how to continue.
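One way to finish for fixed $c<1$, sketched via a standard branching-process comparison (this is my own hedged suggestion, not the argument from the linked thread): the breadth-first exploration of the component of any fixed node is stochastically dominated by a Galton–Watson tree with $\operatorname{Bin}(n-1, c/n)$ offspring, whose mean is at most $c$; since the node is chosen uniformly, the same domination applies to $S$. The expected size of generation $g$ of that tree is at most $c^g$, hence $$\Bbb E[S] \;\leq\; \sum_{g \geq 0} c^g \;=\; \frac{1}{1-c} \;=\; O(1).$$ This bounds $\Bbb E[S]$ directly by a constant, so for fixed $c < 1$ the split at $k_n$ is not even needed; your estimate $n\,\Bbb P(|C_{\max}| \geq a\log(n)) \to 0$ would in any case take care of the tail term in your decomposition.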
I would recommend the following book: https://www.math.cmu.edu/~af1p/BOOK.pdf, Chapter 2, on the evolution of random graphs: it is all there and it is explained very well.
Regarding connectedness, see Chapter 4 of that book, or page 265 of https://www.cs.cornell.edu/jeh/book.pdf, which is also an excellent resource.
In particular, for $c<1$, the bound $E(S)=O(\log n)$ is Theorem 2.9, whose proof is broken down into three further theorems.
Now Theorem 2.10 shows that with high probability, for $c<1$ every component in the random graph has at most one cycle.
Then Theorem 2.11 shows that if you look at the components with exactly one cycle, then with high probability the total number of vertices in such components is $O(w(n))$, for $w(n)$ any function growing to infinity; so the components with exactly one cycle have, if you will, size $o(\log \log n)$.
It remains then to check the order of components with no cycles, i.e. trees.
Now Theorem 2.12 is a bit more general, since it works for any $c \neq 1$, but in particular it works for $c<1$, which is what you want.
There they show that with high probability the largest component of the random graph is an isolated tree on $O(\log n)$ vertices. How do they show it? First, they show that with high probability there is an isolated tree with $k_{-}$ vertices, where $k_{-} \leq a \log n$ (see the book for details). But they also show that there is no isolated tree of order at least $k_{+} > k_{-}$, where $k_{+} \leq b \log n$ for some $b > a$; hence with high probability the largest component is a tree of order $O(\log n)$.
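The $O(\log n)$ behaviour of the largest subcritical component is also easy to see numerically. A small pure-Python sketch (the function `largest_component` and the parameters $n = 1000$, $c = 0.5$ are my own illustrative choices, not from the book):

```python
import math
import random
from collections import deque

def largest_component(n, p, rng):
    """Sample G(n, p) and return the size of its largest connected component."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    unseen = set(range(n))
    best = 0
    while unseen:
        # BFS one component at a time, removing visited vertices from `unseen`
        v = unseen.pop()
        size, queue = 1, deque([v])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w in unseen:
                    unseen.remove(w)
                    size += 1
                    queue.append(w)
        best = max(best, size)
    return best

rng = random.Random(2)
n, c = 1000, 0.5
sizes = [largest_component(n, c / n, rng) for _ in range(5)]
print(sizes, math.log(n))  # each size should be a small multiple of log(n)
```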