Asymptotics of sum of binomial distributions


Definition 1: For any $\mathbb{N}$-valued random variable $X$, define $\mathrm{Bin}(p,X)$ as a random variable which, conditionally on $X=n$, has the binomial distribution with parameters $n$ and $p$ (a $p$-thinning of $X$).

Definition 2: Define recursively, for every $i \in \mathbb{N}$, the random variables $$X_{i+1} = \mathrm{Bin}(p, X_{i}^1) + \mathrm{Bin}(1-p, X_{i}^2),$$ where, for every $i\in \mathbb{N}$, the two binomial variables in the sum are independent, and $X_{i}^1$ and $X_{i}^2$ are independent copies of $X_i$. The initial random variable $X_0$ is given, with expectation $\lambda>0$.
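The recursion can be simulated directly by thinning. Below is a minimal sketch (not from the question): the starting law $P(X_0=0)=P(X_0=4)=1/2$, so $\lambda=2$, is an illustrative assumption, and pairing each sample with a distant entry of the same pool only approximates two independent copies of $X_i$.

```python
import random

rng = random.Random(0)

def binomial_thin(p, n, rng):
    # Given n trials, count successes with probability p each:
    # this realizes Bin(p, X) conditionally on X = n.
    return sum(1 for _ in range(n) if rng.random() < p)

def iterate_once(p, samples, rng):
    # One step of X_{i+1} = Bin(p, X_i^1) + Bin(1-p, X_i^2).
    # Pairing entry i with the entry half a pool away uses two distinct
    # i.i.d. inputs per output; outputs share inputs across pairs, which
    # is an approximation acceptable for sketching the marginal law.
    half = len(samples) // 2
    return [binomial_thin(p, samples[i], rng)
            + binomial_thin(1 - p, samples[(i + half) % len(samples)], rng)
            for i in range(len(samples))]

p, lam, n_samples = 0.3, 2.0, 20000
# Assumed starting law: P(X_0 = 0) = P(X_0 = 4) = 1/2, so E[X_0] = 2.
samples = [4 if rng.random() < 0.5 else 0 for _ in range(n_samples)]
for _ in range(8):
    samples = iterate_once(p, samples, rng)

print(sum(samples) / len(samples))  # empirical mean stays near lam
```

Since $\mathbb{E}[\mathrm{Bin}(p,X)]=p\,\mathbb{E}[X]$, each step preserves the mean: $p\lambda+(1-p)\lambda=\lambda$, which the empirical mean reflects.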

Question 1: Is the probability distribution of $X_\infty$ well defined and unique?

Question 2: If yes, does the probability distribution of $X_\infty$ solve the equation $X_\infty = \mathrm{Bin}(p, X_{\infty}^1) + \mathrm{Bin}(1-p, X_{\infty}^2)$, where $X_\infty^1$ and $X_\infty^2$ have the same probability distribution as $X_\infty$ and the two variables in the sum are independent? Clearly, $\mathbb{E}[X_i] = \mathbb{E}[X_0]=\lambda$ for all $i\in \mathbb{N}$.

This question is related to my other question, Solution of equation of binomial random variables, where the equation $X = \mathrm{Bin}(p, X^1) + \mathrm{Bin}(1-p, X^2)$ has been solved, under the assumption that $X$, $X^1$ and $X^2$ are independent and identically distributed.

As explained in the comments, this is really the iteration of a transformation of distributions, not of random variables, and the technique used to find the fixed points in the other question can be adapted to settle the asymptotics.

To be brief (since this is a rehashing of arguments already explained), the sequence of generating functions $\varphi_n(s)=E(s^{X_n})$ solves the recursion $$\varphi_{n+1}(s)=\varphi_n(p+qs)\,\varphi_n(q+ps)=\varphi_n(1-q(1-s))\,\varphi_n(1-p(1-s)),$$ where $q=1-p$. Iterating, each step multiplies the deficit $1-s$ by either $p$ or $q$, so after $n$ steps the argument $1-p^kq^{n-k}(1-s)$ appears with binomial multiplicity, hence $$\log\varphi_n(s)=\sum_{k=0}^n{n\choose k}\log\varphi_0\left(1-p^kq^{n-k}(1-s)\right). $$

Assuming that $X_0$ is square integrable with $E(X_0)=\lambda$, one knows that, when $t\to0$, $$\varphi_0(1-t)=1-\lambda t+O(t^2),$$ hence, for some suitable constants $A$ and $B$, $$-\lambda t+At^2\leqslant\log\varphi_0(1-t)\leqslant-\lambda t+Bt^2,$$ for every $t\in[0,1]$.

Thus, on the one hand, $$\log\varphi_n(s)\leqslant\sum_{k=0}^n{n\choose k}\left(-\lambda p^kq^{n-k}(1-s)+Bp^{2k}q^{2(n-k)}(1-s)^2\right),$$ that is, $$\log\varphi_n(s)\leqslant-\lambda(1-s)+B\cdot(p^2+q^2)^n(1-s)^2. $$ On the other hand, $$\log\varphi_n(s)\geqslant-\lambda(1-s)+A\cdot(p^2+q^2)^n(1-s)^2. $$

Since $p^2+q^2\lt1$, this shows that $\log\varphi_n(s)\to-\lambda(1-s)$ for every $s$, that is, that the distributions converge to the Poisson distribution with parameter $\lambda=E(X_0)$.
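The closed-form sum for $\log\varphi_n$ can be checked numerically. A minimal sketch, assuming the concrete (illustrative, not from the question) starting law $P(X_0=0)=P(X_0=4)=1/2$, so that $\lambda=2$ and $\varphi_0(s)=\tfrac12+\tfrac12 s^4$:

```python
from math import comb, log

p = 0.3
q = 1 - p
lam = 2.0  # E[X_0] for the example law below

def log_phi0(s):
    # pgf of the assumed X_0: P(X_0=0) = P(X_0=4) = 1/2, so E[X_0] = 2.
    return log(0.5 + 0.5 * s**4)

def log_phi_n(n, s):
    # Closed form from the answer:
    # log phi_n(s) = sum_k C(n,k) log phi_0(1 - p^k q^(n-k) (1-s))
    return sum(comb(n, k) * log_phi0(1 - p**k * q**(n - k) * (1 - s))
               for k in range(n + 1))

s = 0.5
for n in (0, 5, 15, 30):
    # Should approach the Poisson limit -lam * (1 - s) as n grows,
    # at geometric rate (p^2 + q^2)^n.
    print(n, log_phi_n(n, s), -lam * (1 - s))
```

With $p=0.3$ one has $p^2+q^2=0.58$, so by $n=30$ the error term $B(p^2+q^2)^n(1-s)^2$ is far below machine-visible scales and the printed values match $-\lambda(1-s)$ closely.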

To sum up:

Question 1: Is the probability distribution of $X_\infty$ well defined and unique?

Yes.

Question 2: If yes, does the probability distribution of $X_\infty$ solve the following equation, $X_\infty = \mathrm{Bin}(p, X_{\infty}^1) + \mathrm{Bin}(1-p, X_{\infty}^2)$, where $X_\infty^1$ and $X_\infty^2$ have the same probability distribution as $X_\infty$ and the variables in the sum are independent?

Yes.