Why does a concentration inequality imply equivalence of moments and exponential integrability of the square?


I have the following basic question about a concentration inequality:

Let $X$ be a random variable, denote by $m$ a median and assume that for every $t>0$ one has

$$ P \left( \left| X - m \right| > t \right) \leq C e^{-t^2/\lambda} $$

for some positive constants $C$ and $\lambda$.

Why does this imply that $\operatorname{E} \left( e^{a X^2} \right) < \infty$ for some $a>0$, and that all moments are equivalent, i.e. that for all $0<p,q<\infty$ there is a positive constant $K_{p,q}$ such that $\left\lVert X \right\rVert_p \leq K_{p,q} \left\lVert X \right\rVert_q$, where $\left\lVert X \right\rVert_p^p = \operatorname{E} \left( \left| X \right|^p \right)$?

Background: I have condensed the above statement from Theorem 4.7 on page 100 of the book Probability in Banach Spaces by Ledoux and Talagrand. In case I have stripped a necessary assumption, here is the full statement for reference:

Let $X$ be a Rademacher series in a Banach space $B$ such that for some countable subset $D$ of the unit ball of $B^{\ast}$ one has $\left\lVert x \right\rVert = \sup_{f \in D} \left| f(x) \right|$ for all $x \in B$. Denote by $M$ a median of $\left\lVert X \right\rVert$. Then

$$ P \left( \left| \left\lVert X \right\rVert - M \right| > t \right) \leq 4 \exp \left( -t^2/(8\sigma^2) \right), $$

where $\sigma = \sup_{f \in D} \sqrt{\operatorname{E} \left( f^2(X) \right)}$. In particular, there exists $\alpha>0$ such that $\operatorname{E} \left( \exp \left( \alpha \left\lVert X \right\rVert^{2} \right) \right) < \infty$ and all moments of $X$ are equivalent: that is, for any $0<p,q<\infty$, there is a constant $K_{p,q}$ depending on $p,q$ only such that $\left\lVert X \right\rVert_p \leq K_{p,q} \left\lVert X \right\rVert_q$.

Best answer

Tail bounds of this form are equivalent to the finiteness of $\mathbb E\, e^{aX^2}$ for some $a>0$: integrating the tails gives the exponential integrability, and conversely Markov's inequality recovers subgaussian tails from the finiteness of $\mathbb E\, e^{aX^2}$. Such random variables are called subgaussian random variables. For more details, see [1, Proposition 2.5.2].
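Concretely, the tail-to-moment direction is a layer-cake computation: for any $0 < a < 1/\lambda$,

$$ \operatorname{E} \left( e^{a (X-m)^2} \right) = 1 + \int_0^\infty 2at\, e^{at^2}\, P \left( \left| X - m \right| > t \right) dt \leq 1 + 2aC \int_0^\infty t\, e^{-(1/\lambda - a)t^2}\, dt = 1 + \frac{aC}{1/\lambda - a} < \infty, $$

and since $X^2 \leq 2(X-m)^2 + 2m^2$, this yields $\operatorname{E} \left( e^{a X^2 / 2} \right) \leq e^{a m^2} \operatorname{E} \left( e^{a (X-m)^2} \right) < \infty$.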

The equivalence of moments is a subtler issue. The subgaussian property implies $\|X\|_p = O(\sqrt{p})$ for all $p\geq 1$, i.e. it bounds every moment from above, but bounding a higher moment by a lower one (say $\|X\|_2 \leq K_{2,1} \|X\|_1$) does not hold for general subgaussian variables; that is where it is crucial that $X$ is a Rademacher series. The equivalence of moments for Rademacher series is precisely the famous Khintchine–Kahane inequality. See, for example, the Wikipedia article, a short proof by Rafał Latała and Krzysztof Oleszkiewicz, and a recent talk by Tomasz Tkocz.
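As a quick sanity check (a minimal sketch with arbitrary coefficients, not from the source), one can compute the $L^2$ and $L^4$ norms of a scalar Rademacher sum exactly by enumerating all sign patterns. Khintchine's inequality predicts $\|X\|_4 \leq 3^{1/4}\|X\|_2$, while $\|X\|_2 \leq \|X\|_4$ holds by Jensen.

```python
from itertools import product

def rademacher_moments(coeffs):
    """Exact L2 and L4 norms of X = sum(a_i * eps_i), eps_i i.i.d. signs,
    computed by enumerating all 2^n sign patterns."""
    n = len(coeffs)
    total = 2 ** n
    m2 = m4 = 0.0
    for signs in product((-1.0, 1.0), repeat=n):
        x = sum(s * a for s, a in zip(signs, coeffs))
        m2 += x ** 2
        m4 += x ** 4
    return (m2 / total) ** 0.5, (m4 / total) ** 0.25

coeffs = [1.0, 0.5, 0.25, 2.0, 0.7, 1.3]  # arbitrary illustrative choice
l2, l4 = rademacher_moments(coeffs)
# Khintchine: ||X||_4 <= 3^{1/4} ||X||_2 for Rademacher sums
print(l4 <= 3 ** 0.25 * l2)  # True
# The easy direction ||X||_2 <= ||X||_4 is just Jensen's inequality
print(l2 <= l4)  # True
```

For Rademacher sums one can also verify the constant directly, since $\operatorname{E} X^4 = 3\left(\sum_i a_i^2\right)^2 - 2\sum_i a_i^4 \leq 3\left(\operatorname{E} X^2\right)^2$.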

[1] Roman Vershynin, High-Dimensional Probability: An Introduction with Applications in Data Science. https://www.math.uci.edu/~rvershyn/papers/HDP-book/HDP-book.html