maximal bound for a sub-gaussian RV


By definition, a random variable $X$ is $\nu$-sub-Gaussian if the logarithm of its moment generating function is bounded, for all $\lambda \in \mathbb{R}$, by

$$ \log(\mathbb{E}[\exp(\lambda X)]) \leq \frac{\lambda^2\nu}{2}. $$ It can also be shown that every sub-Gaussian RV (in this sense) is centered, i.e. $\mathbb{E}[X] = 0$.

Just now, I saw a result which proves that, for sub-Gaussian RVs $X_1,\dots,X_N$, $$ \mathbb{E}\Big[\max\limits_{i=1,\dots,N} X_i\Big] \leq \sqrt{2\nu\log N}. $$

We already know that the expectation of each $X_i$ is zero, so what is the point of bounding the expected maximum of the RVs? Moreover, isn't expectation defined over a sample or population? What is the intuition behind this bound, and what exactly are we bounding?

BEST ANSWER

Looking at the random variable $M_N=\max_{1\le i\le N}X_i$ is an example of extreme value theory, an area of probability theory that is very active at the moment. Since $E[X_i]=0$, roughly speaking $X_i>0$ approximately half of the time, so for large $N$ we would expect $M_N>0$ with high probability. If the $X_i$ are independent and satisfy $P(X_i\le x)<1$ for all $x$ (which is the case for Gaussian random variables, for example), then $P(M_N\le x)=P(X_1\le x)^N\to0$ as $N\to\infty$, so in fact one can show that $M_N\to+\infty$ almost surely. The question then becomes: how quickly? If $X_i\sim\mathcal N(0,1)$, then one can show $E[M_N]\approx\sqrt{2\log N}$. The point of the bound in your question is to show that the maximum of $N$ sub-Gaussian random variables grows no faster than this.
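For reference, here is a sketch of the standard argument behind the bound (my addition, using the usual MGF trick; the source the question has in mind may argue differently). For any $\lambda>0$, by Jensen's inequality and bounding the max by a sum,

$$ \exp\big(\lambda\,\mathbb{E}[M_N]\big) \le \mathbb{E}[\exp(\lambda M_N)] = \mathbb{E}\Big[\max_{1\le i\le N} \exp(\lambda X_i)\Big] \le \sum_{i=1}^N \mathbb{E}[\exp(\lambda X_i)] \le N \exp\!\Big(\frac{\lambda^2\nu}{2}\Big), $$

so $\mathbb{E}[M_N] \le \frac{\log N}{\lambda} + \frac{\lambda\nu}{2}$. Minimizing over $\lambda$ (take $\lambda=\sqrt{2\log N/\nu}$) gives $\mathbb{E}[M_N]\le\sqrt{2\nu\log N}$. Note that independence of the $X_i$ is not needed for this bound.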
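To see the growth rate numerically, here is a small Monte Carlo sketch (my addition, not part of the original answer) that estimates $E[M_N]$ for i.i.d. standard normals, which are $1$-sub-Gaussian, and compares it with the bound $\sqrt{2\log N}$; the function name `expected_max` and the trial count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_max(N, trials=2000):
    """Monte Carlo estimate of E[max_{1<=i<=N} X_i] for X_i ~ N(0,1)."""
    samples = rng.standard_normal((trials, N))  # each row is one experiment
    return samples.max(axis=1).mean()           # average the row-wise maxima

for N in (10, 100, 1000, 10000):
    bound = np.sqrt(2 * np.log(N))  # sub-Gaussian bound with nu = 1
    est = expected_max(N)
    print(f"N={N:6d}  estimated E[M_N] = {est:.3f}  sqrt(2 log N) = {bound:.3f}")
```

The estimates sit below the bound for every $N$ and grow slowly, matching the $\sqrt{2\log N}$ rate described above.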