I've read all the questions related to this, but I couldn't find an answer.
We have $n$ independent Bernoulli variables $X_i \sim Be(p_i)$, where the $p_i$ are i.i.d., say with given expectation and variance $E[p_i]= \mu$ and $Var[p_i]=\sigma^2$ (if it helps, we could also assume that the probability distribution of the $p_i$ is given).
Now we are interested in the sum of the Bernoullis, that is $X:= \sum_{i=1}^n X_i$. If the $X_i$ were identically distributed, $X$ would be binomially distributed, $X \sim Bin(n,p)$, but here (as each $X_i$ has its own parameter $p_i$) $X$ has no easy-to-write-down distribution (or does it? I think it would be the convolution of the $X_i$). But, conditionally on the $p_i$, we can say that $$E[X \mid p_1,\dots,p_n]=\sum_{i=1}^n p_i$$ and $$Var[X \mid p_1,\dots,p_n]=\sum_{i=1}^n p_i (1-p_i).$$
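A quick Monte-Carlo sanity check of these conditional moments (the choice of Beta(2, 5) for the $p_i$ is just an illustrative assumption; any distribution on $[0,1]$ would do):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Draw the p_i once and keep them fixed (we condition on them below).
# Beta(2, 5) is an arbitrary illustrative choice, not part of the problem.
p = rng.beta(2, 5, size=n)

trials = 200_000
# Each row is one realisation of (X_1, ..., X_n); X is the row sum.
X = (rng.random((trials, n)) < p).sum(axis=1)

print(X.mean(), p.sum())               # E[X | p] = sum p_i
print(X.var(), (p * (1 - p)).sum())    # Var[X | p] = sum p_i (1 - p_i)
```

Both empirical moments should agree with the formulas up to Monte-Carlo error.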
What can we say about $X- E[X]$? With Hoeffding's inequality we can get
$$P[X-E[X] \geq t] \leq e^{\frac{-2t^2}{n}}$$ and
$$P[|X-E[X]| \geq t] \leq 2e^{\frac{-2t^2}{n}}$$
if I understood that right. But does the fact that all the $p_i$ have the same distribution help me in any way?
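A simulation can at least confirm that the one-sided Hoeffding bound holds here, since each $X_i \in [0,1]$ regardless of its parameter (the Beta(2, 2) distribution for the $p_i$ is again only an assumed example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
# Assumed example distribution for the p_i; fixed once, so E[X | p] = p.sum().
p = rng.beta(2, 2, size=n)

trials = 100_000
X = (rng.random((trials, n)) < p).sum(axis=1)

t = 10.0
empirical = np.mean(X - p.sum() >= t)   # empirical P[X - E[X] >= t]
bound = np.exp(-2 * t**2 / n)           # Hoeffding: exp(-2 t^2 / n)
print(empirical, bound)                 # empirical tail sits below the bound
```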
How can we relate $X$ and $Y$, where $Y:=\sum_{i=1}^n Y_i$ with $Y_i \sim Be(\mu)$? And how can we relate $X$ and $Z$, where $Z:=\sum_{i=1}^n Z_i$ with $Z_i \sim Be(p)$ for some $p$?
More specifically: I am interested in approximating $X$ by $E[X]$, and I know this is quite easy to do for $Y$ and $E[Y]$ (since $Y$ is binomially distributed). So I'd really like to show that asymptotically (for $n \rightarrow \infty$) there is no difference between $X$ and $Y$.
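One observation that may help (stated as a sketch, under the assumption that the $p_i$ are i.i.d. and independent across the pairs $(X_i, p_i)$): marginally each $X_i \sim Be(\mu)$ and the $X_i$ stay independent, so unconditionally $X \sim Bin(n,\mu)$, i.e. $X$ and $Y$ have exactly the same distribution. A simulation that redraws the $p_i$ for every realisation (Beta(2, 3) is an assumed example, giving $\mu = 0.4$):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)
n = 10
a, b = 2.0, 3.0        # assumed Beta(2, 3) distribution for the p_i
mu = a / (a + b)       # E[p_i] = 0.4

trials = 500_000
# Fresh i.i.d. p_i for every realisation -> X is unconditionally Bin(n, mu).
p = rng.beta(a, b, size=(trials, n))
X = (rng.random((trials, n)) < p).sum(axis=1)

emp = np.bincount(X, minlength=n + 1) / trials
binom = np.array([comb(n, k) * mu**k * (1 - mu)**(n - k) for k in range(n + 1)])
print(np.abs(emp - binom).max())   # largest gap between empirical pmf and Bin(n, mu)
```

The maximal deviation between the empirical pmf of $X$ and the $Bin(n,\mu)$ pmf should be of pure Monte-Carlo size.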
I don't expect you to solve this task for me, but I have no idea what to look for, and I would really appreciate any hint or link to a paper. (I googled "sum of independent Bernoullis" and looked at many papers, but I don't think they capture the fact that the Bernoulli parameters share the same distribution.)
Thank you very much for your help.
(If $n=2$ we have $X_1 \sim Be(p_1)$, $X_2 \sim Be(p_2)$, with $p_1$ and $p_2$ i.i.d. with distribution function $F$ supported on $[0,1]$. Then, since $P(X_i=0 \mid p_i=u)=1-u$ and $P(X_i=1 \mid p_i=u)=u$,
$$P(X_1+X_2=0)=P(X_1=0)P(X_2=0)=\left(\int_0^1 P(X_1=0\mid p_1=u)\,dF(u)\right)\left(\int_0^1 P(X_2=0\mid p_2=u)\,dF(u)\right)$$
$$=\left(\int_0^1 (1-u)\,dF(u)\right)^2=\big(E[1-p_1]\big)^2=(1-\mu)^2,$$
$$P(X_1+X_2=2)=P(X_1=1)P(X_2=1)=\left(\int_0^1 u\,dF(u)\right)^2=\big(E[p_1]\big)^2=\mu^2,$$
$$P(X_1+X_2=1)=1-(1-\mu)^2-\mu^2=2\mu(1-\mu).$$
This is exactly the $Bin(2,\mu)$ distribution.)
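A quick numeric check of the $n=2$ case (assuming, for concreteness, $p_1, p_2 \sim$ Uniform$(0,1)$, so $\mu = 1/2$): the simulated point probabilities of $X_1+X_2$ should match the $Bin(2,\mu)$ pmf, i.e. $(1-\mu)^2$, $2\mu(1-\mu)$, $\mu^2$.

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 500_000
mu = 0.5  # for Uniform(0, 1), E[p_i] = 1/2

# Assumed example: p_1, p_2 drawn i.i.d. Uniform(0, 1) afresh for each trial.
p = rng.random((trials, 2))
S = (rng.random((trials, 2)) < p).sum(axis=1)

for k, target in [(0, (1 - mu) ** 2), (1, 2 * mu * (1 - mu)), (2, mu ** 2)]:
    print(k, np.mean(S == k), target)   # empirical P(S = k) vs. Bin(2, mu) pmf
```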