An elementary application of the integral identity


Given that $$\mathbb{P}\bigg(|X|>C\cdot (\sigma\sqrt{\log{n}+u}+K(\log{n}+u))\bigg)<2e^{-u},$$ show that $$\mathbb{E}\big(|X|\big)<C'\cdot(\sigma\sqrt{\log{n}+1}+K(\log{n}+1)),$$ where $C,C'>0$ are absolute constants and $K, \sigma>0$.

The textbook (Vershynin, High-Dimensional Probability, Exercise 5.4.11) says that I'm supposed to use the integral identity $\mathbb{E}\big(|X|\big)=\int_0^\infty \mathbb{P}(|X|>a)\,da$ to show this. What I did is the change of variable $a=C(\sigma\sqrt{\log{n}+u}+K(\log{n}+u))$, so $da=C\big(K+\frac{\sigma}{2\sqrt{\log n+u}}\big)\,du$. Then I have $$\mathbb{E}\big(|X|\big)\le \int_{-\log{n}}^\infty 2e^{-u}\, C\Big(K+\frac{\sigma}{2\sqrt{\log n +u}}\Big)\,du=2nCK+2nC\sigma\int_0^\infty e^{-t^2}\,dt,$$ which does not solve the problem, as it is linear in $n$ instead of $\log n$ as required.


There is 1 answer below.

BEST ANSWER

I think there are two ways to do this. The first is the $\min(1,U(t))$ trick with the integral identity: since probabilities are at most $1$, we can bound $\mathbb{P}(|X|>t)\le\min(1,U(t))$, where $U(t)$ is the tail bound. Solve $U(r)=1$ to find the split point $r$, and then use $\mathbb{E}|X|\le\int_0^r 1\,dt+\int_r^\infty U(t)\,dt$.
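As a sketch of how that first route plays out with this particular tail bound (I split at $u=0$ rather than exactly where the bound equals $1$, which only changes constants, and I assume $n\ge 3$ so that $\log n\ge 1$):

```latex
% Write R(u) = C(\sigma\sqrt{\log n + u} + K(\log n + u)), so the hypothesis
% reads P(|X| > R(u)) <= 2e^{-u}.  Split the integral identity at a = R(0):
\begin{align*}
\mathbb{E}|X| &= \int_0^\infty \mathbb{P}(|X|>a)\,da
  \le R(0) + \int_{R(0)}^\infty \mathbb{P}(|X|>a)\,da \\
&= R(0) + \int_0^\infty \mathbb{P}\big(|X|>R(u)\big)\,R'(u)\,du
  \qquad (a = R(u),\ R \text{ increasing}) \\
&\le R(0) + \int_0^\infty 2e^{-u}\,C\Big(K + \frac{\sigma}{2\sqrt{\log n + u}}\Big)\,du \\
&\le C\big(\sigma\sqrt{\log n} + K\log n\big) + 2CK + \frac{C\sigma}{\sqrt{\log n}}
  \le 3C\big(\sigma\sqrt{\log n + 1} + K(\log n + 1)\big),
\end{align*}
% using 1/\sqrt{\log n + u} <= 1/\sqrt{\log n} <= 1 in the last two steps.
```

Splitting at $u=0$ works because the bound $2e^{-u}$ is only smaller than $1$ for $u>\log 2$ anyway; the extra slack is absorbed into the absolute constant.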

But specific to this question of mine, it is easiest to regard $u$ as a random variable $U$ such that $|X|=R(U)$, where $R(u)$ is the RHS of the tail bound. Since $R$ is monotone increasing, $\mathbb{P}(U>u)=\mathbb{P}(|X|>R(u))\le 2e^{-u}$, so $U$ is subexponential with $\mathbb{E}\,U$ bounded by an absolute constant. Take expectations on both sides of $|X|=R(U)$, and use Jensen's inequality to put the expectation inside the square root.
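Spelled out, that argument might run as follows (a sketch, writing $R(u)=C(\sigma\sqrt{\log n+u}+K(\log n+u))$ for the RHS of the tail bound):

```latex
% Let U = R^{-1}(|X|).  Note U >= -\log n, since |X| >= 0 = R(-\log n).
% Since R is increasing, P(U > u) = P(|X| > R(u)) <= 2e^{-u} for u >= 0, hence
\begin{align*}
\mathbb{E}\,U &\le \mathbb{E}\max(U,0) = \int_0^\infty \mathbb{P}(U>u)\,du
  \le \int_0^\infty 2e^{-u}\,du = 2, \\
\mathbb{E}|X| &= \mathbb{E}\,R(U)
  = C\big(\sigma\,\mathbb{E}\sqrt{\log n + U} + K(\log n + \mathbb{E}\,U)\big) \\
&\le C\big(\sigma\sqrt{\log n + \mathbb{E}\,U} + K(\log n + \mathbb{E}\,U)\big)
  \qquad \text{(Jensen: $\sqrt{\cdot}$ is concave)} \\
&\le C\big(\sigma\sqrt{\log n + 2} + K(\log n + 2)\big)
  \le 2C\big(\sigma\sqrt{\log n + 1} + K(\log n + 1)\big),
\end{align*}
% so C' = 2C works, using log n + 2 <= 2(log n + 1) for n >= 1.
```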