Standard normal distribution tails: Lower bound


I am reading Foundations of Machine Learning, page 443, and wonder how to prove inequality (D.19), the following lower bound on the standard normal tail:

If $N$ is a random variable following the standard normal distribution, then for $u\ge 0$: $$\mathbb{P}[N\ge u]\ge\frac{1}{2}\left(1-\sqrt{1-e^{-u^2}}\right)$$

I searched Google and asked GPT-4, but neither helped.


Best answer:

After days of exploration, I found that the inequality can be proved from a result of G. Pólya (1949) and J. D. Williams (1946): $$\Phi(x)-\frac{1}{2}=\int_{0}^{x}(2\pi)^{-\frac{1}{2}}e^{-\frac{1}{2}t^2}\,dt\le\frac{1}{2}\left(1-e^{-\frac{2}{\pi}x^2}\right)^{\frac{1}{2}}$$
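As a sanity check, the Pólya–Williams bound can be verified numerically on a grid of nonnegative $x$. This sketch uses only the standard library, computing $\Phi$ via the error function ($\Phi(x)=\frac{1}{2}(1+\operatorname{erf}(x/\sqrt{2}))$); the function names are my own, not from any reference.

```python
import math

def phi(x):
    # Standard normal CDF via the error function:
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def polya_upper(x):
    # Polya/Williams upper bound on Phi(x) - 1/2 for x >= 0
    return 0.5 * math.sqrt(1.0 - math.exp(-2.0 * x * x / math.pi))

# Check Phi(x) - 1/2 <= bound on a grid of x in [0, 10]
ok = all(phi(x) - 0.5 <= polya_upper(x) + 1e-12
         for x in [i * 0.01 for i in range(1001)])
print(ok)  # True
```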

Then, we have \begin{align} \mathbb{P}[N\ge u]&=1-\mathbb{P}[N\le u]\\ &=1-\Phi(u)\\ &\ge 1-\frac{1}{2}-\frac{1}{2}\left(1-e^{-\frac{2}{\pi}u^2}\right)^{\frac{1}{2}}\\ &=\frac{1}{2}\left(1-\sqrt{1-e^{-\frac{2}{\pi}u^2}}\right)\\ &\ge\frac{1}{2}\left(1-\sqrt{1-e^{-u^2}}\right) \end{align} where the last inequality holds because $\frac{2}{\pi}\le 1$, so $e^{-\frac{2}{\pi}u^2}\ge e^{-u^2}$ for $u\ge 0$.

Here $\Phi(x)=\int_{-\infty}^{x}(2\pi)^{-\frac{1}{2}}e^{-\frac{1}{2}t^2}\,dt$ denotes the standard normal CDF.
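The final bound (D.19) itself can be checked numerically the same way. A minimal sketch, again using only the standard library, with hypothetical function names:

```python
import math

def tail(u):
    # P[N >= u] = 1 - Phi(u), with Phi via erf
    return 0.5 * (1.0 - math.erf(u / math.sqrt(2.0)))

def lower_bound(u):
    # Claimed lower bound (D.19): (1 - sqrt(1 - exp(-u^2))) / 2
    return 0.5 * (1.0 - math.sqrt(1.0 - math.exp(-u * u)))

# Check tail(u) >= lower_bound(u) on a grid of u in [0, 10]
ok = all(tail(u) >= lower_bound(u) - 1e-12
         for u in [i * 0.05 for i in range(201)])
print(ok)  # True
```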