Let $B_i\sim \mathrm{Bernoulli}(p)$, $i=1,\ldots,n$, be i.i.d. Bernoulli random variables, and let $C_j\sim \mathrm{Bernoulli}(1-p)$, $j=1,\ldots,n$, be $n$ more i.i.d. Bernoulli random variables.
I would like to calculate:
$$E\left[\frac{\sum_{i=1}^n a_i B_i + \sum_{j=1}^n b_j C_j}{\sum_{i=1}^n B_i + \sum_{j=1}^n C_j}\right]$$
where the $a_i$ and $b_j$ all lie in $[-1, 1]$. I'm particularly interested in the case $b_j=0$ for all $j=1,\ldots,n$.
I figured that the denominator is a sum of $2n$ independent Bernoulli variables, so it follows a Poisson binomial distribution with mean $np + n(1-p) = n$.
I've checked empirically that when $b_j=0$ the expected value appears to be $p\cdot \frac{1}{n}\sum_{i=1}^n a_i$, but I'd like to prove it formally. If I could split the expectation of the ratio into a ratio of expectations, everything would check out, but I know that, in general, this doesn't hold.
Is there any known result that can help with this?
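For reference, this is the kind of simulation I used for the empirical check (a minimal sketch; the seed, $n$, $p$, and number of trials are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def ratio_mean(a, b, p, n, trials=200_000):
    """Monte Carlo estimate of E[(sum_i a_i B_i + sum_j b_j C_j) / (sum_i B_i + sum_j C_j)],
    dropping the (exponentially rare) draws whose denominator is zero."""
    B = rng.binomial(1, p, (trials, n))        # B_i ~ Bernoulli(p)
    C = rng.binomial(1, 1 - p, (trials, n))    # C_j ~ Bernoulli(1-p)
    num = B @ a + C @ b
    den = B.sum(axis=1) + C.sum(axis=1)
    keep = den > 0
    return (num[keep] / den[keep]).mean()

n, p = 50, 0.3
a = rng.uniform(-1, 1, n)
b = np.zeros(n)                                # the case b_j = 0
print(ratio_mean(a, b, p, n), p * a.mean())    # the two values nearly agree
```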
For large $n$, and disregarding the event that the denominator is zero (its probability, $p^n(1-p)^n$, decays exponentially in $n$), we can treat the numerator and denominator as two independent, approximately Gaussian variables.
Now we use this reasoning: if $Z=X/Y$ where $X$ and $Y$ are independent, then (asymptotically, under certain conditions):
$$ \mu_Z \approx \mu_X/\mu_Y$$ $$ \sigma^2_Z \approx \frac{\mu_X^2}{\mu_Y^2}\left( \frac{\sigma_X^2}{\mu_X^2} +\frac{\sigma_Y^2}{\mu_Y^2} \right)$$
In our case
$$ \mu_X = A n p + B n (1-p) \hskip{1cm} \sigma^2_X= (A' + B')np(1-p) $$ $$ \mu_Y = n p + n (1-p) =n \hskip{1cm} \sigma^2_Y= 2np(1-p) $$
where $A=\frac{1}{n}\sum a_i$ , $A'= \frac{1}{n}\sum a_i^2$ etc. Notice that $|A| \le 1$, $|A'| \le 1$.
(Granted, our $X$ and $Y$ are not really independent, but we can expect that at least the expression for the mean remains valid.)
Then
$$\mu_Z \approx A p + B(1-p)$$
which reduces (for $B=0$) to your empirical result.
As for the variance:
$$\sigma_Z^2 \approx \frac{p(1-p)}{n} \left[\left( 2 A^2 - 4 A B + 2 B^2\right) p^2 + \left( 4 A B - 4 B^2\right) p + A' + B' + 2 B^2 \right] \to 0$$
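Both approximations can be sanity-checked against a simulation (a minimal sketch; the seed, $n$, $p$, and the particular $a_i$, $b_j$ are arbitrary choices). Since the derivation ignores the correlation between numerator and denominator, expect the variance to agree only roughly:

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 100, 0.3
a = rng.uniform(-1, 1, n)
b = rng.uniform(-1, 1, n)
A, B = a.mean(), b.mean()
A2, B2 = (a**2).mean(), (b**2).mean()        # A' and B'

# Delta-method predictions from above
mu_pred = A * p + B * (1 - p)
var_pred = p * (1 - p) / n * (
    (2*A**2 - 4*A*B + 2*B**2) * p**2 + (4*A*B - 4*B**2) * p + A2 + B2 + 2*B**2
)

# Monte Carlo estimate of the ratio's mean and variance
trials = 200_000
Bs = rng.binomial(1, p, (trials, n))         # B_i ~ Bernoulli(p)
Cs = rng.binomial(1, 1 - p, (trials, n))     # C_j ~ Bernoulli(1-p)
den = Bs.sum(axis=1) + Cs.sum(axis=1)
keep = den > 0                               # discard zero denominators
Z = (Bs @ a + Cs @ b)[keep] / den[keep]
print(mu_pred, Z.mean())                     # nearly identical
print(var_pred, Z.var())                     # rough agreement (correlation ignored)
```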