Upper bounding a tricky sum


For a problem in probability, I'm trying to find an upper bound for

$$ \sum_{d=0}^k\binom{k}{d}\gamma^d(1-\gamma)^{k-d}\left(1-p^d(1-p)^{k-d}\right)^m$$

which will help me analyze what values of $k$, $m$ and $p$ are needed to make the sum go to 0. (Note $0< p,\gamma<1$ and $k$ and $m$ are integers.)

I have tried to approximate the inner part of the right parenthesis using the exponential function, and the binomial part using a normal distribution, but the $p^d$ terms seem to stand in the way of this approach.

If I expand the $m$-th power using the binomial theorem, I can rewrite the sum as

$$ \sum_{l=0}^m\binom{m}{l}(-1)^l\left(\gamma p^l+(1-\gamma)(1-p)^l\right)^k$$

But that doesn't help me much.
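As a sanity check, the two forms of the sum can be compared numerically (a quick sketch; the function names are just illustrative):

```python
from math import comb

def s_direct(k, m, p, gamma):
    # Original sum over d = 0, ..., k.
    return sum(comb(k, d) * gamma**d * (1 - gamma)**(k - d)
               * (1 - p**d * (1 - p)**(k - d))**m
               for d in range(k + 1))

def s_expanded(k, m, p, gamma):
    # Rewritten sum after expanding (1 - x)^m by the binomial theorem.
    return sum(comb(m, l) * (-1)**l
               * (gamma * p**l + (1 - gamma) * (1 - p)**l)**k
               for l in range(m + 1))

# The two forms agree up to floating-point error, e.g. for k=8, m=6:
print(s_direct(8, 6, 0.3, 0.6), s_expanded(8, 6, 0.3, 0.6))
```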

I've run out of ideas; do you know any good tricks I should check out?

Accepted answer:

Define $\alpha \in (0, 1)$ and $\beta \in (0, \infty)$ as

$$ \alpha = p^{\gamma}(1-p)^{1-\gamma} \quad \text{and} \quad \beta = \frac{p}{1-p}. $$

Also we write $X = X_k \sim \operatorname{Binomial}(k, \gamma)$ and denote the sum by $S = S_{m,k}$. Then we can write

$$ S = \Bbb{E}[ (1 - \alpha^k \beta^{X-k\gamma} )^m ]. \tag{1} $$
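The identity (1) holds because $\alpha^k \beta^{X - k\gamma} = p^X (1-p)^{k-X}$; a quick numerical sketch of this rewriting (helper names are ad hoc):

```python
from math import comb

def s_direct(k, m, p, gamma):
    # S as originally defined.
    return sum(comb(k, d) * gamma**d * (1 - gamma)**(k - d)
               * (1 - p**d * (1 - p)**(k - d))**m
               for d in range(k + 1))

def s_expectation(k, m, p, gamma):
    # S written as E[(1 - alpha^k * beta^(X - k*gamma))^m], X ~ Binomial(k, gamma).
    alpha = p**gamma * (1 - p)**(1 - gamma)
    beta = p / (1 - p)
    return sum(comb(k, x) * gamma**x * (1 - gamma)**(k - x)
               * (1 - alpha**k * beta**(x - k * gamma))**m
               for x in range(k + 1))
```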

Since $p^X (1-p)^{k-X} \leq \max \{p, 1-p\}^k \to 0$ uniformly as $k \to \infty$, for any $\epsilon > 0$, we have

$$ \exp(-(1+\epsilon)\color{red}{\alpha^k \beta^{X-k\gamma}}) \leq 1 - \color{red}{\alpha^k \beta^{X-k\gamma}} \leq \exp(-\color{red}{\alpha^k \beta^{X-k\gamma}}) $$

whenever $k$ is sufficiently large. Now regard $m = m_k$ as a sequence in $k$ and write $a_k = m_k \alpha^k$. Then this inequality implies

$$ \Bbb{E}[\exp(-(1+\epsilon)a_k \beta^{X-k\gamma})] \leq S \leq \Bbb{E}[\exp(-a_k \beta^{X-k\gamma})] \tag{2} $$

for large $k$.
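For moderate $k$ the sandwich (2) can be verified numerically; here $m = \lceil \alpha^{-k} \rceil$ is just an illustrative choice making $a_k \approx 1$, not something imposed by the argument:

```python
from math import comb, exp, ceil

def sandwich(k, p, gamma, eps=0.1):
    # Returns (lower bound, S, upper bound) from inequality (2).
    alpha = p**gamma * (1 - p)**(1 - gamma)
    beta = p / (1 - p)
    m = ceil(alpha**-k)  # illustrative choice: a_k = m * alpha^k is about 1
    pmf = [comb(k, x) * gamma**x * (1 - gamma)**(k - x) for x in range(k + 1)]
    t = [alpha**k * beta**(x - k * gamma) for x in range(k + 1)]  # = p^x (1-p)^(k-x)
    S = sum(w * (1 - u)**m for w, u in zip(pmf, t))
    lower = sum(w * exp(-(1 + eps) * m * u) for w, u in zip(pmf, t))
    upper = sum(w * exp(-m * u) for w, u in zip(pmf, t))
    return lower, S, upper
```

For $k = 20$, $p = 0.3$, $\gamma = 0.6$ one has $\max_x t_x \leq 0.7^{20} \approx 8 \cdot 10^{-4}$, small enough for the left inequality with $\epsilon = 0.1$.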

If $\gamma$ and $p$ are fixed, then we can find a condition for $m = m_k$ which guarantees $S \to 0$ as $k \to \infty$.

Claim. $S = S(m_k, k)$ converges to $0$ if and only if $m_k \alpha^k r^{\sqrt{k}} \to \infty$ as $k \to \infty$ for every $r > 0$.

Sufficient condition

Consider first the case $\beta \leq 1$. Let $x \in \Bbb{R}$ be arbitrary. Then by (2),

\begin{align*} S &\leq \Bbb{E}[ \exp ( -a_k \beta^{X-\gamma k} ) \mathbf{1}_{\{ (X-\gamma k)/\sqrt{k} < x\}} ] \\ &\qquad +\Bbb{E}[ \exp ( -a_k \beta^{X-\gamma k} ) \mathbf{1}_{\{ (X-\gamma k)/\sqrt{k} \geq x\}} ] \\ &\leq \exp( -a_k \beta^{x\sqrt{k}} ) + \Bbb{P}((X-\gamma k)/\sqrt{k} \geq x). \end{align*}

Since the assumption gives $a_k \beta^{x\sqrt{k}} \to \infty$ (take $r = \beta^x$), combining this with the classical CLT yields the following estimate of the limsup of $S$:

$$ \limsup_{k\to\infty} S \leq \Phi(-x/\sqrt{\smash[b]{\gamma(1-\gamma)}}), $$

where $\Phi$ is the CDF of the standard normal distribution. Now taking $x \to \infty$ we get the desired conclusion when $\beta \leq 1$. The other case $\beta > 1$ can be treated similarly, and we omit the proof.
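To illustrate the sufficient direction numerically: the growth rate $m_k = \lceil \alpha^{-k} e^{k^{3/4}} \rceil$ (an illustrative choice, not taken from the answer) satisfies $m_k \alpha^k r^{\sqrt{k}} \to \infty$ for every $r > 0$, and the corresponding values of $S$ do drop off quickly:

```python
from math import comb, exp, ceil, log1p

def s_value(k, m, p, gamma):
    # S computed stably via (1 - t)^m = exp(m * log1p(-t)).
    return sum(comb(k, d) * gamma**d * (1 - gamma)**(k - d)
               * exp(m * log1p(-(p**d) * (1 - p)**(k - d)))
               for d in range(k + 1))

p, gamma = 0.3, 0.6
alpha = p**gamma * (1 - p)**(1 - gamma)
# S shrinks rapidly along k = 10, 20, 30 with this choice of m_k.
vals = [s_value(k, ceil(alpha**-k * exp(k**0.75)), p, gamma) for k in (10, 20, 30)]
```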

Necessary condition

Again let us focus on the case $\beta < 1$, so that $\log \beta \neq 0$. Let $r > 0$ be arbitrary, and let $x = \log r / \log \beta$. Then by (2),

\begin{align*} S &\geq \Bbb{E}[ \exp(-(1+\epsilon)a_k \beta^{X-k\gamma}) \mathbf{1}_{\{ (X-\gamma k)/\sqrt{k} \geq x \}} ] \\ &\geq \exp(-(1+\epsilon)a_k r^{\sqrt{k}}) \Bbb{P}((X-\gamma k)/\sqrt{k} \geq x). \end{align*}

Rearranging everything w.r.t. $a_k r^{\sqrt{k}}$, we get

$$ a_k r^{\sqrt{k}} \geq \frac{1}{1+\epsilon}\log \left( \frac{\Bbb{P}((X-\gamma k)/\sqrt{k} \geq x)}{S} \right) . $$

By the classical CLT, the probability in the numerator converges to $\Phi(-x/\sqrt{\smash[b]{\gamma(1-\gamma)}}) > 0$. Hence if $S \to 0$, the right-hand side diverges to $+\infty$, and so $a_k r^{\sqrt{k}} = m_k \alpha^k r^{\sqrt{k}} \to \infty$, as claimed.
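The last displayed inequality can also be checked numerically for moderate $k$, again with the illustrative choices $m = \lceil \alpha^{-k} \rceil$, $r = 1/2$, and $\epsilon = 0.1$ (none of which are prescribed by the argument):

```python
from math import comb, exp, ceil, log, sqrt

def check_lower_bound(k, p, gamma, r=0.5, eps=0.1):
    # Verifies S >= exp(-(1+eps) * a_k * r^sqrt(k)) * P((X - gamma*k)/sqrt(k) >= x)
    # with x = log(r)/log(beta), for the case beta < 1.
    alpha = p**gamma * (1 - p)**(1 - gamma)
    beta = p / (1 - p)
    m = ceil(alpha**-k)              # illustrative growth rate, a_k about 1
    a_k = m * alpha**k
    x = log(r) / log(beta)
    pmf = [comb(k, d) * gamma**d * (1 - gamma)**(k - d) for d in range(k + 1)]
    S = sum(w * (1 - p**d * (1 - p)**(k - d))**m for d, w in enumerate(pmf))
    tail = sum(w for d, w in enumerate(pmf) if (d - gamma * k) / sqrt(k) >= x)
    return S, exp(-(1 + eps) * a_k * r**sqrt(k)) * tail
```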