Decomposition of a Bernoulli Random Variable into a sum of Random Variables


I am having trouble understanding a seemingly simple decomposition of a Bernoulli random variable. This is a question arising from random graph theory.

Data

For context, $A_{ij}$ is an entry in the adjacency matrix of an $n$-node random graph, $q_i$ is the intrinsic probability that node $i$ connects to another node, $g_i$ denotes the cluster assignment of node $i$, and $C_{g_i g_j}$ is a weight affecting the probability that nodes in cluster $g_i$ connect to nodes in cluster $g_j$ (this weight does not depend on the $q_i$'s).

Let $q_i$ be i.i.d. random variables with a compactly supported probability measure on $(0,1)$.

Let $C_{g_i g_j} = 1 + \frac{M_{g_i g_j}}{\sqrt{n}}$, where $M_{g_i g_j} = \mathcal{O}(1)$.

Let $A_{ij} \sim \textrm{Bernoulli}(q_i q_j C_{g_i g_j})$.

Then it is possible to write the following equality of random variables:

$A_{ij} = q_i q_j + q_i q_j \frac{M_{g_i g_j}}{\sqrt{n}} + X_{ij}$

where $\mathbb{E}X_{ij} = 0$ and $\textrm{var}(X_{ij})= q_i q_j (1 - q_i q_j) + \Theta(n^{-1/2})$.
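For concreteness, the model above can be simulated directly. The following is an illustrative sketch only; the law of the $q_i$, the cluster assignments, and the matrix $M$ are hypothetical choices, not part of the original setup:

```python
import numpy as np

# Illustrative sketch of the model (assumed choices: q_i uniform on
# (0.2, 0.8), two clusters, a fixed O(1) matrix M).
rng = np.random.default_rng(1)
n = 1000
q = rng.uniform(0.2, 0.8, size=n)         # i.i.d., compactly supported in (0, 1)
g = rng.integers(0, 2, size=n)            # cluster assignment g_i
M = np.array([[1.0, -0.5],
              [-0.5, 1.0]])               # M_{ab} = O(1)
C = 1.0 + M[np.ix_(g, g)] / np.sqrt(n)    # C_{g_i g_j} = 1 + M_{g_i g_j} / sqrt(n)

P = np.outer(q, q) * C                    # connection probabilities q_i q_j C_{g_i g_j}
A = rng.binomial(1, P)                    # A_{ij} ~ Bernoulli(q_i q_j C_{g_i g_j})
```

With these choices the matrix $P$ of connection probabilities stays inside $(0,1)$, so the Bernoulli draw is well defined.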

Question

At this stage, it is quite unclear how the decomposition was obtained. Furthermore, it is not obvious to me how this decomposition produces values equal to $0$ or $1$ without further assumptions on the $X_{ij}$'s. If someone could explain to me how this decomposition was produced, or point me towards similar results, that would be greatly appreciated.


Answer

If $A \sim \text{Bernoulli}(p)$, then $X := A - p$ is a random variable satisfying $P(X = 1-p) = p$ and $P(X = -p) = 1-p$. You can check that $E[X] = 0$ and $\text{Var}(X) = \text{Var}(A) = p(1-p)$. Then you can rewrite $A$ as $A = p + X$; note that $A$ still takes only the values $0$ and $1$ by construction, since $X$ is simply a recentering of $A$ — no further assumptions on $X$ are needed.
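This can be checked numerically (a minimal sanity check, with an arbitrary choice $p = 0.3$):

```python
import numpy as np

# For A ~ Bernoulli(p), the centered variable X = A - p should have
# mean 0, variance p(1 - p), and take exactly the two values -p and 1 - p.
rng = np.random.default_rng(0)
p = 0.3
A = rng.binomial(1, p, size=500_000)
X = A - p

print(X.mean())              # empirical mean of X, close to 0
print(X.var())               # empirical variance, close to p(1-p) = 0.21
print(np.unique(X))          # X takes only the two values -p and 1-p
```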

Taking $p := q_i q_j C_{g_i g_j}$ and applying the assumptions on $M_{g_i g_j}$ and $n$ should give you the desired decomposition.
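Concretely, expanding $p$ using the definition of $C_{g_i g_j}$:

$$p = q_i q_j C_{g_i g_j} = q_i q_j\left(1 + \frac{M_{g_i g_j}}{\sqrt{n}}\right) = q_i q_j + q_i q_j \frac{M_{g_i g_j}}{\sqrt{n}},$$

so $A_{ij} = p + X_{ij}$ is exactly the stated decomposition. Writing $\varepsilon := q_i q_j \frac{M_{g_i g_j}}{\sqrt{n}}$, the variance is

$$\textrm{var}(X_{ij}) = p(1-p) = q_i q_j (1 - q_i q_j) + (1 - 2 q_i q_j)\,\varepsilon - \varepsilon^2 = q_i q_j (1 - q_i q_j) + \Theta(n^{-1/2}),$$

since $\varepsilon = \Theta(n^{-1/2})$.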