Prove that the number of vertices connected to the root of a given tree has an exponentially decaying tail.


Consider a regular rooted tree in which every vertex has $d$ offspring. Each edge is independently open with probability $p$ and closed with probability $1-p$; let $X_p$ denote the number of vertices connected to the root. Define $\delta(p)=\mathbb{P}[X_p=\infty]$. Prove that

1. There exists a real number $p_d$ such that $\delta(p)>0$ when $p>p_d$, and $\delta(p)=0$ when $p<p_d$.

2. When $p<p_d$, there exist constants $C,D>0$ such that $\mathbb P[X_p\ge k]\le De^{-Ck}$.

My idea for problem $1$ is that every vertex has the same probability $\delta(p)$ of being connected to infinitely many of its offspring; with this one can set up an equation for $\delta(p)$, which always has $\delta(p)=0$ as a root. However, I have no idea how to approach problem $2$, and I cannot see how to set up an equation or inequality for it. Could you provide any hints or suggestions?
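
The fixed-point idea can be made concrete numerically: the root fails to reach an infinite cluster iff each of its $d$ subtrees fails (either the edge is closed or the child's subtree is finite), giving $1-\delta(p) = (1-p\,\delta(p))^d$. A minimal sketch, assuming $d=3$ (my choice of example) and iterating from $1$ so the iteration lands on the largest root:

```python
def delta(p, d=3, iters=10_000):
    """Fixed-point iteration for 1 - delta = (1 - p*delta)^d,
    started from 1 so it converges to the largest root."""
    x = 1.0
    for _ in range(iters):
        x = 1.0 - (1.0 - p * x) ** d
    return x

# delta(p) = 0 below p_d = 1/d and > 0 above it (d = 3 here, so p_d = 1/3).
for p in (0.2, 0.4, 0.6):
    print(f"p = {p:.2f}:  delta ≈ {delta(p):.4f}")
```

The printed values vanish for $p<1/3$ and are strictly positive for $p>1/3$, consistent with $p_d = 1/d$.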

Thank you!

On BEST ANSWER

That property is called having exponential tails, and it is linked to having finite exponential moments via the Chernoff inequality $$ \mathbb P(X_p \ge k) = \mathbb P\left(e^{CX_p} \ge e^{Ck}\right) \le \frac{\mathbb E[e^{CX_p}]}{e^{Ck}} = \mathbb E[e^{CX_p}]e^{-Ck}. $$ This follows from the Markov inequality: for non-negative $Y$ and $t>0$ we have $\mathbb E[Y] \ge t\,\mathbb P(Y\ge t)$; apply it with $Y = e^{CX_p}$ and $t = e^{Ck}$. So if you can prove that the exponential moment $\mathbb E[e^{CX_p}]$ is finite for some $C>0$, you are finished (with $D = \mathbb E[e^{CX_p}]$).
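
This chain of inequalities can be sanity-checked numerically on any non-negative variable with a finite exponential moment; below, a geometric variable stands in for $X_p$ (the distribution and the parameters $q$, $C$, $k$ are my choices for illustration, not part of the problem):

```python
import math
import random

random.seed(0)

# Geometric stand-in: P(X = n) = (1-q) q^n, so E[e^{CX}] is finite iff e^C q < 1.
q, C = 0.5, 0.3
N = 100_000
samples = []
for _ in range(N):
    n = 0
    while random.random() < q:
        n += 1
    samples.append(n)

mgf = sum(math.exp(C * x) for x in samples) / N   # empirical E[e^{CX}]
k = 10
tail = sum(x >= k for x in samples) / N           # empirical P(X >= k)
bound = mgf * math.exp(-C * k)                    # Chernoff bound E[e^{CX}] e^{-Ck}
print(f"P(X>={k}) ≈ {tail:.5f}  vs  bound {bound:.5f}")
```

The empirical tail sits well below the Chernoff bound, as the inequality guarantees.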

Edit 1: To show that, consider $X_{p,k}$, the number of vertices that are connected to the root (including the root itself) and at most $k$ steps away from it; $X_{p,0} = 1$. The root has $d$ children, and the edge to each child is open independently with probability $p$. This gives the recursion $$ X_{p,k} = 1 + \sum_{i=1}^d B_iX_{p,k-1}^{(i)} $$ where $B_i$ is $1$ with probability $p$ and $0$ otherwise, $X_{p,k-1}^{(i)}$ is an independent copy of $X_{p,k-1}$, and all random variables on the right are independent of each other.
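
The recursion translates directly into a Monte Carlo sampler for $X_{p,k}$. Taking expectations in the recursion also gives $\mathbb E[X_{p,k}] = 1 + pd\,\mathbb E[X_{p,k-1}] = \sum_{j=0}^k (pd)^j$, which the sketch below checks (the parameter values are my choices):

```python
import random

def sample_X(p, d, k):
    """Sample X_{p,k} via X_{p,k} = 1 + sum_i B_i * X_{p,k-1}^(i):
    the root plus, for each of the d children whose edge is open
    (probability p each), an independent copy of the depth-(k-1) count."""
    if k == 0:
        return 1
    return 1 + sum(sample_X(p, d, k - 1) for _ in range(d) if random.random() < p)

random.seed(1)
p, d, k = 0.2, 3, 12                      # subcritical: p*d = 0.6 < 1
mean = sum(sample_X(p, d, k) for _ in range(20_000)) / 20_000
theory = sum((p * d) ** j for j in range(k + 1))
print(f"Monte Carlo E[X_p,k] ≈ {mean:.3f}, theory {theory:.3f}")
```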

So $$ \mathbb E\left[e^{CX_{p,k}}\right] = \mathbb E\left[e^{C + C\sum_{i=1}^d B_i X_{p,k-1}^{(i)}}\right] = e^C \left(\mathbb E\left[e^{C B_1 X_{p,k-1}}\right]\right)^d. $$ Conditioning on $B_1$, this equals \begin{align} &= e^C \left(p\,\mathbb E\left[e^{CX_{p,k-1}}\right] + 1 - p\right)^d\\ &= e^C \left(p\left(\mathbb E\left[e^{CX_{p,k-1}}\right]-1\right) + 1\right)^d, \end{align} a recursion starting from $\mathbb E\left[e^{CX_{p,0}}\right] = e^C$. Since $X_{p,k} \uparrow X_p$ as $k\to\infty$, it suffices to prove that this recursion stays bounded for some $C>0$.
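
The recursion is now a scalar iteration $m_k = e^C\,(p(m_{k-1}-1)+1)^d$ with $m_0 = e^C$, which one can iterate directly to see boundedness in the subcritical regime. A sketch with parameters of my choosing (for $C$ too large the iteration blows up, so only small values are shown):

```python
import math

def mgf_iterate(p, d, C, steps=500):
    """Iterate m_k = e^C * (p*(m_{k-1}-1) + 1)^d starting from m_0 = e^C."""
    m = math.exp(C)
    for _ in range(steps):
        m = math.exp(C) * (p * (m - 1) + 1) ** d
    return m

p, d = 0.2, 3                              # p*d = 0.6 < 1: subcritical
for C in (0.02, 0.05, 0.1):
    print(f"C = {C}:  m_500 ≈ {mgf_iterate(p, d, C):.6f}")
```

For these small $C$ the iterates settle at a finite limit, which is exactly the fixed point discussed next.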

Edit 2: Let us now write this with a function, $f(x):=e^C(p(x-1)+1)^d$, so that $\mathbb E\left[e^{CX_{p,k}}\right] = \underbrace{f\circ\dots\circ f}_{k \text{ times}}(e^C)$. This function is a polynomial with a single $d$-fold root at $x=1-\frac1p$. Its fixed points are the intersections with the identity function $x\mapsto x$. If $C=0$, then $f(1) = 1$ and $f'(1) = dp < 1$. So for $C=0$ there is a fixed point at $x=1$, and we want to show that there is still one for $C>0$ sufficiently small. The Taylor expansion of $f(x)-x$ around $x=1$ is $$ 0 \stackrel!= f(x)-x = e^C-1 + (e^Cpd-1)(x-1) + O\left((x-1)^2\right). $$ Without the error term, $x = 1 + \frac{e^C-1}{1-e^Cpd} > e^C$ would be a fixed point; since $x-1 \to 0$ as $C\to 0$, the error term is negligible and a genuine fixed point $x_{\text{fix}}>e^C$ exists for $C>0$ small enough. Because $f$ is monotone increasing, for any value $x\le x_{\text{fix}}$ we have $f(x)\le f(x_{\text{fix}}) = x_{\text{fix}}$. Since we start at $e^C < x_{\text{fix}}$, all iterates stay bounded by $x_{\text{fix}}$, hence $\mathbb E[e^{CX_p}] \le x_{\text{fix}} < \infty$.
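
The existence of the fixed point and the convergence from $e^C$ can also be checked numerically; the sketch below compares the linearised prediction with the fixed point found by iterating $f$ (the parameters $p$, $d$, $C$ are my choices):

```python
import math

def f(x, p, d, C):
    """f(x) = e^C (p(x-1)+1)^d, the map driving the MGF recursion."""
    return math.exp(C) * (p * (x - 1) + 1) ** d

p, d, C = 0.2, 3, 0.1                      # d*p = 0.6 < 1, C small

# Dropping the quadratic error term predicts a fixed point near here:
x_lin = 1 + (math.exp(C) - 1) / (1 - math.exp(C) * p * d)

# Iterating f from e^C climbs monotonically to the attracting fixed point.
x = math.exp(C)
for _ in range(1000):
    x = f(x, p, d, C)

print(f"linearised ≈ {x_lin:.4f},  iterated x_fix ≈ {x:.4f}")
```

The iterated value sits slightly above the linearised one (the dropped quadratic term is positive), and both exceed the starting point $e^C \approx 1.105$, as the argument requires.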