The following is meant to be solved with the Probabilistic Method:
For $n \in \mathbb{N}_0$, let $S_n$ be the set of all $0$-$1$ sequences of length $n$; in particular $S_0$ has the empty sequence $\emptyset$ as its only element.
For a constant $p \in [0, 1]$ consider the infinite random graph $G(p)$ given via $V := \cup_{n \in \mathbb{N}_0} S_n$ in which, independently for all $0$-$1$ sequences $\sigma \in S_n, \tau \in S_{n+1}$ such that $\sigma$ is the subsequence of $\tau$ consisting of its first $n$ entries, there is an edge in $G(p)$ between $\sigma$ and $\tau$ with probability $p$. Denote the component of $G(p)$ that contains the vertex $\emptyset$ by $C$.
a) If $p < 1/2$, show that $$\mathbb{P}[C \text{ has infinitely many vertices}] = 0.$$ b) If $p > 1/2$, show that there exist constants $c > 1$ and $q > 0$ such that for each $n \in \mathbb{N}_0$, $$\mathbb{P}[C \text{ has at least $c^n$ vertices in } S_n] \ge q.$$
I think that the Markov and Paley-Zygmund inequalities might be useful for this. The natural way to proceed would then be to let $X$ be the number of vertices in $C$.
For a) I have the following idea: Let $X_n$ be the number of vertices of $C$ in $S_n$. Then we have by Markov's inequality and linearity of expectation that
$$\mathbb{P}[X \ge a] \le \frac{\mathbb{E}[X]}{a} = \sum_{n \ge 0} \frac{\mathbb{E}[X_n]}{a}.$$
But I do not see how to compute the expectations $\mathbb{E}[X]$ and $\mathbb{E}[X_n]$. Could you please give me a hint?
For (a), the probability that $C$ contains a given vertex $\sigma \in S_n$ is exactly $p^n$: all $n$ edges on the path from $\emptyset$ to $\sigma$ must be present. Since $|S_n| = 2^n$, we have $\mathbb E[X_n] = (2p)^n$, so for $p < 1/2$, $\mathbb E[X] = \sum_{n\ge 0} (2p)^n = \frac1{1-2p} < \infty$. By Markov's inequality, $\Pr[X \ge a] \le \frac{1}{a(1-2p)} \to 0$ as $a \to \infty$, so $X$ is finite with probability $1$.
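As a quick sanity check (not part of the proof), the identity $\mathbb E[X_n] = (2p)^n$ can be tested by simulation. The key observation is that the level sizes of $C$ form a Galton–Watson branching process: each vertex of $C$ in $S_k$ has two potential children in $S_{k+1}$, each attached independently with probability $p$. This is a minimal sketch; the function name `sample_Xn` and all parameter values are mine, not from the problem.

```python
import random

def sample_Xn(p, n, rng):
    """Sample X_n, the number of vertices of C in S_n.

    Each vertex of C at level k has two potential children at level
    k+1, each kept independently with probability p, so the level
    sizes form a Galton-Watson process with Binomial(2, p) offspring.
    """
    x = 1  # X_0 = 1: the empty sequence is always in C
    for _ in range(n):
        x = sum(1 for _ in range(2 * x) if rng.random() < p)
    return x

rng = random.Random(0)
p, n, trials = 0.4, 6, 100_000
est = sum(sample_Xn(p, n, rng) for _ in range(trials)) / trials
print(f"estimated E[X_{n}] = {est:.4f}, exact (2p)^{n} = {(2 * p) ** n:.4f}")
```

For $p = 0.4$ the estimate should land near $(0.8)^6 \approx 0.2621$, and increasing `trials` tightens the agreement.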
For (b), use the second moment method. As before, $\mathbb E[X_n] = (2p)^n$, and we can write $\mathbb E[X_n^2]$ as $$\sum_{\sigma \in S_n} \sum_{\tau \in S_n} \Pr[\sigma \in C \land \tau \in C].$$ This probability depends on the overlap between $\sigma$ and $\tau$. For $0\le k < n$, there are $2^{2n-k-1}$ pairs $(\sigma, \tau)$ which agree on their first $k$ terms, but not on the $(k+1)^{\text{th}}$: we can choose $\sigma$ arbitrarily ($2^n$ ways), the first $k+1$ terms of $\tau$ are then fixed, and its remaining $n-k-1$ terms are free. For $k=n$, there are $2^n$ such pairs (the diagonal $\sigma = \tau$). When $\sigma$ and $\tau$ agree on exactly their first $k$ terms, both lie in $C$ iff the $k$ shared edges and the two disjoint continuations of length $n-k$ are all present, which happens with probability $p^k \cdot p^{2(n-k)} = p^{2n-k}$. We get $$ \mathbb E[X_n^2] = (2p)^n + \sum_{k=0}^{n-1} 2^{2n-k-1} p^{2n-k}. $$ As an upper bound, extend the sum to an infinite sum (it converges, since the terms decay by a factor $(2p)^{-1} < 1$ as $k$ grows), getting $$ \mathbb E[X_n^2] \le (2p)^n + \sum_{k=0}^\infty 2^{2n-k-1} p^{2n-k} = (2p)^n + \frac{2^{2n-1}p^{2n}}{1 - (2p)^{-1}} = (2p)^n + (2p)^{2n}\frac{p}{2p-1}. $$ One is tempted to subtract $\mathbb E[X_n]^2 = (2p)^{2n}$ and apply Chebyshev, but the resulting bound $\text{Var}[X_n] \le (2p)^n + (2p)^{2n}\frac{1-p}{2p-1}$ only has $\frac{1-p}{2p-1} < 1$ when $p > 2/3$, so Chebyshev with a deviation proportional to the mean fails for $p \in (1/2, 2/3]$. The Paley–Zygmund inequality (which you suggested) works for all $p > 1/2$. Since $2p > 1$, we have $(2p)^n \le (2p)^{2n}$, hence $$ \mathbb E[X_n^2] \le (2p)^{2n}\left(1 + \frac{p}{2p-1}\right) = \mathbb E[X_n]^2 \cdot \frac{3p-1}{2p-1}, $$ and Paley–Zygmund with $\theta = \tfrac12$ gives $$ \Pr\left[X_n \ge \tfrac12 (2p)^n\right] \ge \left(1-\tfrac12\right)^2 \frac{\mathbb E[X_n]^2}{\mathbb E[X_n^2]} \ge \frac{2p-1}{4(3p-1)} > 0, $$ uniformly in $n$. Now fix any $c$ with $1 < c < 2p$. Since $(2p/c)^n \to \infty$, there is an $n_0$ with $\tfrac12(2p)^n \ge c^n$ for all $n \ge n_0$, so $\Pr[X_n \ge c^n] \ge \frac{2p-1}{4(3p-1)}$ for those $n$. For each of the finitely many $n < n_0$, note that $c^n \le 2^n = |S_n|$ (as $c < 2$) and the first $n$ levels contain fewer than $2^{n+1}$ edges, so $\Pr[X_n \ge c^n] \ge \Pr[C \supseteq S_0 \cup \dots \cup S_n] \ge p^{2^{n_0+1}} > 0$. Taking $q$ to be the minimum of these finitely many lower bounds completes the proof.
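Again purely as an illustration rather than a proof: for a value $p \in (1/2, 2/3]$, the empirical probability $\Pr[X_n \ge c^n]$ indeed stays bounded away from $0$ as $n$ grows, consistent with the uniform bound $q = \frac{2p-1}{4(3p-1)}$. The simulation below uses the fact that the level sizes of $C$ form a Galton–Watson process with Binomial$(2, p)$ offspring per vertex; all names and parameter choices here are mine.

```python
import random

def sample_Xn(p, n, rng):
    # Level sizes of C form a Galton-Watson process:
    # each vertex of C has Binomial(2, p) children in C.
    x = 1  # X_0 = 1: the empty sequence is always in C
    for _ in range(n):
        x = sum(1 for _ in range(2 * x) if rng.random() < p)
    return x

rng = random.Random(1)
p, c, trials = 0.6, 1.1, 20_000          # p in (1/2, 2/3], some c in (1, 2p)
q_bound = (2 * p - 1) / (4 * (3 * p - 1))  # Paley-Zygmund lower bound
probs = {}
for n in (2, 4, 6, 8):
    hits = sum(sample_Xn(p, n, rng) >= c ** n for _ in range(trials))
    probs[n] = hits / trials
    print(f"n={n}: P[X_n >= c^n] ~ {probs[n]:.3f}  (bound q = {q_bound:.4f})")
```

For these parameters the empirical probabilities sit well above $q \approx 0.0625$ at every sampled $n$; the bound from Paley–Zygmund is far from tight, which is fine since (b) only asks for some positive $q$.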