Kolmogorov Exponential Bounds (Upper)


This is one version of the Kolmogorov exponential bounds from Allan Gut's Probability: A Graduate Course (2005, pp. 385–386). Let $Y_1, Y_2, \dots$ be a sequence of independent random variables with zero mean and finite variances $\sigma_k^2$, and define $s_n^2 := \sum_{k=1}^n \sigma_k^2$. Suppose in addition that for some $c_n>0$, $|Y_k| \leq c_ns_n \ a.s.$ for $k = 1, \dots, n$ and for all $n \geq 1$. Then for $0<x<\frac{1}{c_n}$ one has the following upper bound: $$ \mathbb P\left(\sum_{k=1}^n Y_k > xs_n \right) \leq \exp\left\{-\frac{x^2}{2}\left(1-\frac{xc_n}{2}\right)\right\}. $$
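As a quick numerical sanity check of the bound (this example is not from the book), one can take i.i.d. Rademacher signs $Y_k = \pm 1$, for which $\sigma_k^2 = 1$, $s_n = \sqrt n$, and $|Y_k| = 1 = c_n s_n$ holds with $c_n = 1/\sqrt n$; a Monte Carlo estimate of the tail should then sit below the theoretical bound:

```python
import math
import random

random.seed(0)

def kolmogorov_upper_bound(x, c_n):
    """exp{-(x^2/2)(1 - x*c_n/2)}, valid for 0 < x < 1/c_n."""
    return math.exp(-0.5 * x * x * (1.0 - 0.5 * x * c_n))

n = 400
trials = 20000
s_n = math.sqrt(n)   # sigma_k^2 = 1 for Rademacher signs, so s_n^2 = n
c_n = 1.0 / s_n      # |Y_k| = 1 = c_n * s_n almost surely

x = 2.0              # any 0 < x < 1/c_n = sqrt(n) works here
hits = 0
for _ in range(trials):
    total = sum(1 if random.random() < 0.5 else -1 for _ in range(n))
    if total > x * s_n:
        hits += 1

empirical = hits / trials
bound = kolmogorov_upper_bound(x, c_n)
print(empirical, bound)  # empirical tail probability vs. theoretical bound
```

The empirical tail frequency comes out well below the bound, as expected; the bound is not tight (for $x = 2$ it is roughly $e^{-1.9}$, while the true tail is close to the Gaussian value $\mathbb P(Z > 2) \approx 0.023$).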

This seems to be a more general version of the bound than the ones usually stated. There are several points in the book's treatment that I do not understand.

  1. What is $c_n$, and how do we find it for a given distribution? In the book he gives the following example for the normal distribution $N(0, \sigma^2)$: $$\mathbb P\left(\sum_{k=1}^n Y_k > x\sigma\sqrt{n}\right) \leq \frac{1}{x}e^{-\frac{x^2}{2}}, \quad x >0.$$ I cannot see how to get the right-hand side from the theorem. Where does the $1/x$ come from? What is $c_n$ in this case, and how do we find it? Could anyone point it out for me, please?
  2. In the proof, he first shows that $$\psi_n(t) := \mathbb E\left[\exp\left(t\sum_{k=1}^n Y_k\right)\right] \leq \exp \left[\frac{t^2s_n^2}{2}\left(1+\frac{tc_ns_n}{2}\right)\right]. \ (\star)$$ I do not follow the line below, especially the summation term inside the brackets. Any pointers, please? $$1+ \frac{t^2}{2}\mathbb E(Y_k^2)\left(1+2\sum_{j=3}^\infty \frac{(tc_ns_n)^{j-2}}{j!}\right) \leq 1+ \frac{t^2\sigma_k^2}{2} \left(1+2tc_ns_n\sum_{j=3}^\infty \frac{1}{j!}\right). $$
  3. After proving $(\star)$, he claims that by Markov's inequality $$\mathbb P \left(\sum_{k=1}^n Y_k > xs_n\right) \leq \frac{\psi_n(t)}{e^{txs_n}}.$$ How does this follow, please? Thank you!

Accepted answer:
  1. Indeed, a normal random variable is unbounded, so there is no finite $c_n$ with $|Y_k|\leqslant c_ns_n$ almost surely, and the estimate is not a direct application of the theorem. One can instead deduce it directly from tail estimates for the normal distribution, or apply the theorem to truncated random variables.
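One way to recover the $\frac1x$ factor directly, using the standard Gaussian tail (Mills-ratio) bound:

```latex
% Y_k ~ N(0, sigma^2) i.i.d., so the sum is N(0, n sigma^2) and s_n = sigma*sqrt(n):
\[
\mathbb P\left(\sum_{k=1}^n Y_k > x\sigma\sqrt{n}\right)
 = \mathbb P(Z > x), \qquad Z \sim N(0,1).
\]
% For x > 0, bound the integrand using u/x >= 1 on the domain of integration:
\[
\mathbb P(Z > x)
 = \int_x^\infty \frac{e^{-u^2/2}}{\sqrt{2\pi}}\,du
 \leq \int_x^\infty \frac{u}{x}\,\frac{e^{-u^2/2}}{\sqrt{2\pi}}\,du
 = \frac{1}{x\sqrt{2\pi}}\,e^{-x^2/2}
 \leq \frac{1}{x}\,e^{-x^2/2}.
\]
```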

  2. The inequality is valid when $tc_ns_n\leqslant 1$: in that case $0\leqslant (tc_ns_n)^{j-2}\leqslant tc_ns_n$ for every $j\geqslant 3$. Some restriction of this kind is needed, since $(\star)$ cannot hold for arbitrarily large $t$.
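Written out, assuming $0\leqslant tc_ns_n\leqslant 1$, the step and how it leads to $(\star)$:

```latex
% For j >= 3 and 0 <= t c_n s_n <= 1, the power (t c_n s_n)^{j-2} is at most t c_n s_n:
\[
\sum_{j=3}^\infty \frac{(tc_ns_n)^{j-2}}{j!}
 \leq tc_ns_n \sum_{j=3}^\infty \frac{1}{j!}.
\]
% The remaining numerical series is small:
\[
\sum_{j=3}^\infty \frac{1}{j!} = e - 1 - 1 - \tfrac{1}{2} = e - \tfrac{5}{2} \approx 0.2183 \leq \tfrac14,
\]
% so the bracket is at most 1 + t c_n s_n / 2, and 1 + y <= e^y then gives
\[
1 + \frac{t^2\sigma_k^2}{2}\left(1 + \frac{tc_ns_n}{2}\right)
 \leq \exp\left\{\frac{t^2\sigma_k^2}{2}\left(1 + \frac{tc_ns_n}{2}\right)\right\}.
\]
% Multiplying over k = 1, ..., n and using independence yields (star).
```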

  3. Notice that for $t\gt 0$, $$\sum_{k=1}^nY_k>xs_n\Leftrightarrow t\sum_{k=1}^nY_k>txs_n \Leftrightarrow \exp\left(t\sum_{k=1}^nY_k\right)>\exp(txs_n),$$ so it suffices to apply Markov's inequality to the nonnegative random variable $\exp\left(t\sum_{k=1}^nY_k\right)$.
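For completeness, here is that step written out, together with the choice $t = x/s_n$, which (as a reconstruction of the proof's final step, assuming $xc_n < 1$ so that $tc_ns_n \leqslant 1$) turns $(\star)$ into the stated bound:

```latex
% Markov's inequality applied to the nonnegative variable exp(t * sum Y_k), t > 0:
\[
\mathbb P\left(\sum_{k=1}^n Y_k > xs_n\right)
 = \mathbb P\left(e^{t\sum_{k=1}^n Y_k} > e^{txs_n}\right)
 \leq \frac{\mathbb E\, e^{t\sum_{k=1}^n Y_k}}{e^{txs_n}}
 = \frac{\psi_n(t)}{e^{txs_n}}.
\]
% Using (star) and choosing t = x/s_n (so that t c_n s_n = x c_n < 1):
\[
\frac{\psi_n(t)}{e^{txs_n}}
 \leq \exp\left\{\frac{x^2}{2}\left(1+\frac{xc_n}{2}\right) - x^2\right\}
 = \exp\left\{-\frac{x^2}{2}\left(1-\frac{xc_n}{2}\right)\right\}.
\]
```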