While investigating the risk consistency of a particular machine-learning algorithm, I encountered the following convergence problem. I would appreciate it if someone could point me in the right direction.
Given the function $$ a_{pq}: \mathbb{N} \rightarrow \mathbb{R},\quad n \mapsto \sum_{m = 0}^{n - 1} \sum_{k = 0}^{n - m - 1} \binom{n}{m} \binom{n - m}{k} p^m (1 - p)^{n - m} q^k (1 - q)^{n-m-k} \text{,} $$ I want to prove that $\lim_{n \rightarrow \infty} a_{pq}(n) = 1$ for all choices of $p \in [0, 1)$ and $q \in [0, 1)$.
My current idea is to show that $a_{pq}(n) \leq 1$ and that there is a simpler function $b_{pq}(n) \leq a_{pq}(n)$ with $\lim_{n \rightarrow \infty} b_{pq}(n) = 1$. Then, $\lim_{n \rightarrow \infty} a_{pq}(n) = 1$ follows by the squeeze theorem. I have not succeeded in finding such a function, though.
A second idea is to express $a_{pq}$ in terms of a hypergeometric function, for which many convergence results are available.
Example: For $p = 0.2$ and $q = 0.2$, $a_{pq}(1) = 0.64$, $a_{pq}(2) \approx 0.87$, $a_{pq}(3) \approx 0.95$, and $a_{pq}(4) \approx 0.98$.
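For reference, here is a small Python sketch (the function name `a` is mine) that evaluates the double sum directly and reproduces the example values above:

```python
from math import comb

def a(p, q, n):
    """Evaluate the double sum defining a_{pq}(n) directly."""
    return sum(
        comb(n, m) * comb(n - m, k)
        * p**m * (1 - p)**(n - m)
        * q**k * (1 - q)**(n - m - k)
        for m in range(n)          # m = 0, ..., n-1
        for k in range(n - m)      # k = 0, ..., n-m-1
    )

for n in range(1, 5):
    print(n, round(a(0.2, 0.2, n), 4))
```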
Thanks a lot.
For any $x,y$ and $N\in\mathbb N$, the binomial theorem implies
$$\sum_{k=0}^{N-1}{N\choose k}x^ky^{N-k}=(x+y)^N-x^N.$$ Using this, we obtain \begin{align} a_{pq}(n)&=\sum_{m=0}^{n-1}{n\choose m}p^m(1-p)^{n-m}\sum_{k=0}^{n-m-1}{n-m\choose k}q^k(1-q)^{n-m-k}\\[.5em] &=\sum_{m=0}^{n-1}{n\choose m}p^m(1-p)^{n-m}(1-q^{n-m})\\[.5em] &=\sum_{m=0}^{n-1}{n\choose m}p^m(1-p)^{n-m}-\sum_{m=0}^{n-1}{n\choose m}p^m\big(q(1-p)\big)^{n-m}\\[1em] &=(1-p^n)-\big((p+q(1-p))^n-p^n\big)\\[1.2em] &=1-(p+q(1-p))^n. \end{align} Since $p,q\in[0,1)$, we have $0\leq p+q(1-p)<p+(1-p)=1$, so $(p+q(1-p))^n\to0$ and therefore $a_{pq}(n)\to1$.
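As a quick numerical sanity check (a sketch; the function names `a_sum` and `a_closed` are mine), the closed form can be compared against the original double sum at random parameter values:

```python
from math import comb
import random

def a_sum(p, q, n):
    """The original double sum from the question."""
    return sum(
        comb(n, m) * comb(n - m, k)
        * p**m * (1 - p)**(n - m)
        * q**k * (1 - q)**(n - m - k)
        for m in range(n)
        for k in range(n - m)
    )

def a_closed(p, q, n):
    """The closed form derived above."""
    return 1 - (p + q * (1 - p))**n

random.seed(1)
for _ in range(100):
    p, q = random.random(), random.random()
    n = random.randint(1, 20)
    assert abs(a_sum(p, q, n) - a_closed(p, q, n)) < 1e-8
```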