Absolute value of sub-gaussian


Let $\eta_1, \eta_2, \dots, \eta_t$ be independent 1-subgaussian random variables with mean $0$ (not necessarily identically distributed).

We know several nice concentration bounds for $|\sum_{s=1}^t \eta_s|$.

How about $\sum_{s=1}^t |\eta_s|$?

  1. Is there a bound on $\mathbb{E}|\eta_s|$?
  2. I believe $\sum_{s=1}^t |\eta_s| \leq O(t \log t)$ with high probability. Is this true?

(Surprisingly, there is no 'sub-gaussian' tag.)


BEST ANSWER
  1. Any 1-subgaussian random variable $X$ satisfies $\|X\|_p \leq C \sqrt{p}$ for every $p \geq 1$, where $C$ is an absolute constant [1, Proposition 2.5.2]. Therefore, $\mathbb{E}|X| = \|X\|_1 \leq C$.
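As a quick numerical sanity check of part 1 (a sketch, not part of the proof): a standard Gaussian is 1-subgaussian, and its absolute first moment is $\mathbb{E}|X| = \sqrt{2/\pi} \approx 0.798$, a constant, as the moment bound predicts.

```python
import math
import random

# A standard Gaussian is 1-subgaussian; its absolute first moment is
# E|X| = sqrt(2/pi) ~ 0.7979, a constant, consistent with E|X| <= C.
random.seed(0)
n = 100_000
emp_mean = sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n

print(emp_mean)                  # empirical E|X|, close to the exact value
print(math.sqrt(2 / math.pi))   # exact E|X| for a standard Gaussian
```

With $10^5$ samples the empirical mean agrees with $\sqrt{2/\pi}$ to about two decimal places.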

  2. Yes, it is true. In fact, something stronger is true. Following the definitions in [1], if $X$ is $\sigma$-subgaussian, i.e., $P(|X| \geq t) \leq 2 \exp(-ct^2/\sigma^2)$, then $|X|$ is also $\sigma$-subgaussian. Let $Z = |X|$. By the centering lemma [1, Lemma 2.6.8], $\|Z - \mathbb{E} Z\|_{\psi_2} \leq K \|Z\|_{\psi_2} \leq K'$ for absolute constants $K, K'$ (using $\sigma = 1$ here).

Applying the general Hoeffding inequality [1, Theorem 2.6.2], we get that with probability at least $1 - \delta$, $\left|\sum_{i=1}^n (Z_i - \mathbb{E} Z_i)\right| = O(\sqrt{n \log(1/\delta)})$.

Combining this with part 1 and the triangle inequality, with probability at least $1 - \delta$, $\sum_{i=1}^n |X_i| = O(n) + O(\sqrt{n \log(1/\delta)})$, which for fixed $\delta$ is $O(n)$, better than the conjectured $O(n \log n)$.
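The final bound can also be checked by simulation (a sketch using i.i.d. standard Gaussians as the 1-subgaussian variables): $\sum_i |X_i|$ should concentrate around $n \cdot \mathbb{E}|X| = n\sqrt{2/\pi}$, with fluctuations of order $\sqrt{n}$ rather than anything like $n \log n$.

```python
import math
import random

# Monte Carlo sketch: for i.i.d. standard Gaussians (1-subgaussian),
# sum_i |X_i| stays within O(sqrt(n)) of n * E|X| = n * sqrt(2/pi),
# matching the O(n) + O(sqrt(n log(1/delta))) bound.
random.seed(1)

def abs_sum(n):
    """Draw sum of |X_i| for n independent standard Gaussians."""
    return sum(abs(random.gauss(0.0, 1.0)) for _ in range(n))

mean_abs = math.sqrt(2 / math.pi)  # exact E|X| for a standard Gaussian
for n in [1_000, 5_000, 25_000]:
    trials = [abs_sum(n) for _ in range(40)]
    max_dev = max(abs(s - n * mean_abs) for s in trials)
    # the normalized deviation max_dev / sqrt(n) stays bounded as n grows
    print(n, max_dev / math.sqrt(n))
```

The printed ratios stay roughly constant across $n$, confirming that the deviations scale like $\sqrt{n}$, not $n \log n$.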

P.S.: If you do not need the logarithmic dependence on $\delta$, then what you require holds under much weaker assumptions, e.g., unit variance.

[1]: Roman Vershynin, High-Dimensional Probability. https://www.math.uci.edu/~rvershyn/papers/HDP-book/HDP-book.html