How to prove the tail bound of summation of non-symmetric Bernoulli random variables?


Let $a_{ij}$, $1\leq i<j\leq n$, be the entries of the adjacency matrix of an Erdős–Rényi graph, so each $a_{ij}$ is a Bernoulli random variable:

$a_{ij}=1$ with probability $p$, and $a_{ij}=0$ with probability $1-p$.

Now I would like to bound the tail probability

$P(|\sum_{1\leq i<j\leq n}(a_{ij}-p)|\geq t)$
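Not part of the proof, but as a sanity check, this probability can be estimated by Monte Carlo simulation. The values of $n$, $p$, $t$ and the trial count below are arbitrary illustrative choices:

```python
import numpy as np

# Monte Carlo estimate of P(|sum_{i<j}(a_ij - p)| >= t) for G(n, p).
# n, p, t, n_trials are illustrative choices, not from the question.
rng = np.random.default_rng(0)
n, p, t, n_trials = 30, 0.3, 20.0, 2000
m = n * (n - 1) // 2          # number of independent edge indicators a_ij, i < j

# Draw all edge indicators for all trials at once: shape (n_trials, m)
a = rng.random((n_trials, m)) < p
s = (a - p).sum(axis=1)       # one centered sum per trial
tail_prob = np.mean(np.abs(s) >= t)
print(tail_prob)
```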

What I tried is:

$a_{ij}-p$ is a centered Bernoulli random variable: it has expectation $0$, and it equals $1-p$ with probability $p$ and $-p$ with probability $1-p$.

Now I follow the proof of Hoeffding's inequality for symmetric Bernoulli random variables.

\begin{equation} \begin{aligned} P\Big(\sum (a_{ij}-p)\geq t\Big)&=P\Big(\exp\Big(\lambda \sum (a_{ij}-p)\Big)\geq \exp(\lambda t)\Big)\\ &\leq \frac{E\big(\exp\big(\lambda \sum (a_{ij}-p)\big)\big)}{\exp(\lambda t)}, \end{aligned} \end{equation}

where $\lambda>0$ and the second step is Markov's inequality.

So now we need to estimate the moment generating function $E(\exp(\lambda \sum (a_{ij}-p)))$, and this is where I got stuck.

If the random variables $X_i$ are symmetric Bernoulli variables (each $\pm 1$ with probability $1/2$), then by independence $E(\exp(\lambda \sum_i X_i))=\prod_i E(\exp(\lambda X_i))$, and for each factor $E(\exp(\lambda X_i))=(\exp(\lambda)+\exp(-\lambda))/2=\cosh(\lambda)\leq \exp(\lambda^2/2)$.
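As a quick numerical sanity check of that last inequality (illustrative only), one can verify $\cosh(\lambda)\leq \exp(\lambda^2/2)$ on a grid of $\lambda$ values:

```python
import math

# Check cosh(lam) <= exp(lam^2 / 2) for the symmetric Bernoulli MGF
# over a grid of lambda values in [-5, 5].
for lam in [x / 10 for x in range(-50, 51)]:
    mgf = (math.exp(lam) + math.exp(-lam)) / 2   # E[exp(lam * X)], X = ±1 w.p. 1/2
    assert mgf <= math.exp(lam**2 / 2) + 1e-12
print("cosh bound holds on the grid")
```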

However, in our case $X_i$ is $a_{ij}-p$, and I don't know how to upper bound its moment generating function.
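For what it's worth, the MGF here can be written out explicitly, $E(\exp(\lambda(a_{ij}-p)))=p e^{\lambda(1-p)}+(1-p)e^{-\lambda p}$, and a standard tool for bounding such MGFs is Hoeffding's lemma (not used in the question above, so this is only a hint): for a mean-zero variable supported on an interval of length $1$, such as $a_{ij}-p\in[-p,1-p]$, the MGF is at most $\exp(\lambda^2/8)$. A numerical check of that bound:

```python
import math

# The MGF of a_ij - p is explicit:
#   E[exp(lam*(a_ij - p))] = p*exp(lam*(1-p)) + (1-p)*exp(-lam*p).
# Hoeffding's lemma (a standard result, offered here only as a hint) gives
# for a mean-zero variable on [-p, 1-p] (an interval of length 1):
#   E[exp(lam*(a_ij - p))] <= exp(lam^2 / 8).
# Numerical check over a grid of p and lambda:
for p in [0.1, 0.3, 0.5, 0.7, 0.9]:
    for lam in [x / 10 for x in range(-40, 41)]:
        mgf = p * math.exp(lam * (1 - p)) + (1 - p) * math.exp(-lam * p)
        assert mgf <= math.exp(lam**2 / 8) + 1e-12
print("Hoeffding's lemma bound holds on the grid")
```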