Kolmogorov-like maximal inequality with finite exponential moments


Let $(X_i)_{i\geq 1}$ be a countable collection of independent random variables with symmetric distributions, i.e. $P(X_i\in A)=P(X_i \in -A)$ for all $i\geq 1$. If $\lambda\in\mathbb{R}$ is such that $E(e^{\lambda X_i})$ is finite for all $i$, I want to prove the following Kolmogorov-like inequality: $$P\left(\max_{1\leq k\leq n} \left|\sum_{i=1}^{k}X_{i}\right|>t\right)\leq e^{-\lambda t}\prod_{i=1}^{n}E(e^{\lambda X_i })$$ Do you have a hint for a solution?




I think this bound is incorrect, because the right-hand side is too small. For example, let the $X_i$ be i.i.d. Rademacher variables, i.e., $P(X_i = 1) = P(X_i = -1) = 1/2$; then $E(e^{\lambda X_i})<\infty$ for every $\lambda\in \mathbb{R}$. Take $t = 1/2$ and $n=1$. Then $P(|X_1|>t) = 1$, but the right-hand side is $$ \frac{1}{2}e^{-\lambda/2}(e^{\lambda} + e^{-\lambda}) = \frac{1}{2}\left[e^{\lambda/2} + e^{-3\lambda/2}\right], $$ which can take values smaller than $1$. Specifically, its minimum, attained at $\lambda=\frac{1}{2}\ln 3$, is $\frac{1}{2}\left(3^{1/4}+3^{-3/4}\right)\approx 0.87738<1$.
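This minimum is easy to verify numerically; here is a quick sketch (the grid range is an arbitrary choice):

```python
import math

# Right-hand side of the claimed bound for a single Rademacher variable
# with t = 1/2 and n = 1, as a function of lambda.
def rhs(lam):
    return 0.5 * (math.exp(lam / 2) + math.exp(-3 * lam / 2))

# Calculus gives the minimizer lam* = (1/2) * ln 3; compare against a
# coarse grid and confirm the minimum dips below 1, while the
# left-hand side P(|X_1| > 1/2) equals 1.
lam_star = 0.5 * math.log(3)
grid_min = min(rhs(l / 1000) for l in range(-5000, 5001))

print(rhs(lam_star))   # about 0.87738
print(grid_min)
```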

In the following I will prove a similar bound using martingale theory. It is slightly worse than the stated bound and requires the stronger condition $E[e^{\lambda|X_i|}]<\infty$.

Since the distribution of $X_i$ is symmetric, $E(e^{\lambda X_i}) = E(e^{-\lambda X_i})$, so we may assume $\lambda>0$. Moreover, since $e^{\lambda|x|}\leq e^{\lambda x}+e^{-\lambda x}$, we get $E(e^{\lambda|X_i|})<\infty$, hence $E|X_i|<\infty$, and symmetry then gives $EX_i=0$. Therefore $S_k = \sum_{i=1}^kX_i$ is a martingale.

Since $f(x) = e^{|\lambda x|}$ is a convex function, $e^{|\lambda S_k|}$ is a submartingale. Now by Doob's inequality for submartingales (Theorem 5.4.2 in Probability: Theory and Examples by Rick Durrett), we have $$ P(\max_{1\leq k\leq n}e^{|\lambda S_k|}>e^{\lambda t})\leq e^{-\lambda t}Ee^{|\lambda S_n|}. $$ Furthermore, if we assume $E[e^{\lambda|X_i|}]<\infty$, then we can use the triangle inequality $|S_n|\leq\sum_{i=1}^n|X_i|$ and independence to get

$$ P(\max_{1\leq k\leq n}e^{|\lambda S_k|}> e^{\lambda t}) \leq e^{-\lambda t}Ee^{|\lambda S_n|} \leq e^{-\lambda t}\cdot E e^{\sum_{i=1}^n \lambda|X_i|} = e^{-\lambda t}\prod_{i=1}^n(Ee^{\lambda|X_i|}) $$

The last point is to notice that, since $\lambda>0$, $$ P(\max_{1\leq k\leq n}e^{|\lambda S_k|}>e^{\lambda t}) = P(\max_{1\leq k\leq n}|\lambda S_k|>{\lambda t}) = P(\max_{1\leq k\leq n}| S_k|>{t}). $$ Therefore the upper bound I can get is $$ P(\max_{1\leq k\leq n}| S_k|>{t})\leq e^{-\lambda t}Ee^{|\lambda S_n|}\leq e^{-\lambda t}\prod_{i=1}^n(Ee^{\lambda|X_i|}). $$
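This final bound can be sanity-checked by Monte Carlo; the sketch below uses i.i.d. standard normal increments with arbitrary illustrative parameters (both sides are estimated from the same samples):

```python
import numpy as np

# Monte Carlo check of the martingale bound
#   P(max_{k<=n} |S_k| > t) <= exp(-lam*t) * prod_i E[exp(lam*|X_i|)]
# for i.i.d. standard normal X_i; n, t, lam, m are illustrative choices.
rng = np.random.default_rng(0)
n, t, lam, m = 10, 15.0, 2.0, 200_000

X = rng.standard_normal((m, n))
S = np.cumsum(X, axis=1)          # partial sums S_1, ..., S_n per sample

lhs = np.mean(np.max(np.abs(S), axis=1) > t)
# Empirical moment generating function of |X_i|, one factor per i.
rhs = np.exp(-lam * t) * np.prod(np.mean(np.exp(lam * np.abs(X)), axis=0))

print(lhs, rhs)
```

For these parameters the right-hand side is below $1$, so the bound is non-trivial, and the empirical left-hand side sits far beneath it.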


Using extrapolation I am able to prove your statement, but only for $t$ large relative to $\lambda^{-1}$ and under the stronger condition $E[e^{\lambda |X_i|}] < \infty$ (perhaps you were missing an absolute value). Denote by $S$ the random variable $$ S = \sum_{j = 1}^\infty X_j $$ (assuming the series converges in $L^1$) and by $\Sigma_n$ the $\sigma$-algebra generated by the random variables $X_1,\dots,X_n$. Since each $X_j$ has mean $0$, the conditional expectation $E_n\colon L^1(\Omega) \to L^1(\Omega; \Sigma_n)$ satisfies $$ E_n(S) = \sum_{j = 1}^n X_j =: S_n. $$

By Doob's theorem and Marcinkiewicz interpolation we have \begin{align} P \Big( \max_{1 \leq k \leq n} |E_k(S_n)| > t \Big) & \lesssim \frac{\|S_n\|_1}{t},\\ P \Big( \max_{1 \leq k \leq n} |E_k(S_n)| > t \Big) & \lesssim \Big( \frac{p}{p-1}\Big) \frac{\|S_n\|^p_p}{t^p}. \end{align} Therefore \begin{align} (e^{t \lambda} -1) \, P \Big( \max_{1 \leq k \leq n} |E_k(S_n)| > t \Big) & = \sum_{p=1}^\infty \frac{t^p \lambda^p}{p!} P \Big( \max_{1 \leq k \leq n} |E_k(S_n)| > t \Big)\\ & \lesssim \sum_{p=1}^\infty \frac{t^p \lambda^p}{p!} \frac{E [|S_n|^p ]}{t^p}\\ & = E \Big[ \sum_{p=1}^\infty \frac{\lambda^p |S_n|^p }{p!} \Big]\\ & = E \Big[ \mathrm{exp}\big(\lambda |S_n|\big) - 1 \Big] \leq E \Big[ \mathrm{exp}\big(\lambda |S_n|\big) \Big]. \end{align}

But now, using the triangle inequality and independence, we obtain $$ E \Big[ \mathrm{exp}\big(\lambda |S_n|\big) \Big] \leq \prod_{k = 1}^n E \big[e^{\lambda |X_k|}\big]. $$ In sum, we have $$ P \Big( \max_{1 \leq k \leq n} \Big| \sum_{j=1}^k X_j \Big| > t \Big) \lesssim \frac{1}{e^{t \lambda} - 1} \prod_{k = 1}^n E \big[e^{\lambda |X_k|}\big]. $$
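The weak-type maximal inequalities feeding the extrapolation can themselves be checked numerically; the sketch below verifies $P(\max_{1\leq k\leq n}|S_k|>t)\leq E[|S_n|^p]/t^p$ (without the interpolation constant, which only strengthens the check) for $p=2,4$, using i.i.d. Rademacher increments and arbitrary illustrative parameters:

```python
import numpy as np

# Monte Carlo check of the maximal inequality used above,
#   P(max_{k<=n} |S_k| > t) <= E[|S_n|^p] / t^p,
# a consequence of Doob's inequality for the submartingale |S_k|^p.
# X_i are i.i.d. Rademacher; n, t, m are illustrative choices.
rng = np.random.default_rng(1)
n, t, m = 10, 4.0, 200_000

X = rng.choice([-1.0, 1.0], size=(m, n))
S = np.cumsum(X, axis=1)          # partial sums of the random walk

lhs = np.mean(np.max(np.abs(S), axis=1) > t)
rhs = {p: np.mean(np.abs(S[:, -1]) ** p) / t ** p for p in (2, 4)}

print(lhs, rhs)
```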