Properties, bounds and limit of the difference of two standard normal quantiles, and the extreme value distribution


I'm interested in the quantity $$\sigma_n=\Phi^{-1}\left(1-{1\over n}e^{-1}\right)-\Phi^{-1}\left(1-{1\over n}\right),$$ where $\Phi(\cdot)$ is the CDF of the standard normal distribution. I want to prove that $\sigma_n$ is monotonically decreasing for $n\ge 2$, which my simulations suggest, but I do not know how to prove it. If this property holds, then since $\sigma_n>0$ for all $n\ge 2$, it follows that $\sigma_n$ has a non-negative limit.

Alternatively, just telling me whether $\sigma_n$ is bounded or has a finite limit would already help.

More info:

$\sigma_n$ comes from the distribution $GEV(x;\mu_n,\sigma_n,0)$, which approximates the distribution of the maximum of $n$ i.i.d. standard normal random variables (the extreme value Type I, or Gumbel, distribution). See https://en.wikipedia.org/wiki/Generalized_extreme_value_distribution.

My simulation shows:

  1. when $n$ increases from 2 to 2000, $\sigma_n$ decreases from 0.9 to 0.27

  2. for $n=2, 10, 10^2, \dots, 10^7$, $\sigma_n=0.90,0.51,0.35,0.29,0.25,0.22,0.20,0.18$, respectively
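These simulated values can be reproduced with a short script; the sketch below uses Python's standard-library `statistics.NormalDist` for $\Phi^{-1}$ (the function name `sigma` is mine, not from the question):

```python
from math import exp
from statistics import NormalDist

def sigma(n: float) -> float:
    """sigma_n = Phi^{-1}(1 - e^{-1}/n) - Phi^{-1}(1 - 1/n)."""
    inv = NormalDist().inv_cdf  # standard normal quantile function
    return inv(1 - exp(-1) / n) - inv(1 - 1 / n)

# check the claimed monotone decrease for n = 2, ..., 2000
values = [sigma(n) for n in range(2, 2001)]
assert all(a > b for a, b in zip(values, values[1:]))

print(round(sigma(2), 2), round(sigma(2000), 2))  # prints 0.9 0.27
```

The endpoints match the simulated range ($0.9$ down to $0.27$) quoted above.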

Best answer

We start from the identity $$\Phi^{-1}\left(1-\frac{x}{n}\right) = \sqrt{2}\ \text{erf}^{-1}\left(1-\frac{2x}{n}\right),$$ which gives $$\sigma_n=\sqrt{2}\Bigg[\text{erf}^{-1}\left(1-\frac{1}{en}\right)-\text{erf}^{-1}\left(1-\frac{1}{n}\right) \Bigg].$$

For small $x$ we have the approximation $$\text{erf}^{-1}\left(1-x\right)\approx\sqrt{\frac{1}{2} \left(\log \left(\frac{2}{\pi x^2}\right)-\log \left(\log \left(\frac{2}{\pi x^2}\right)\right)\right)}$$ (have a look here). It is very accurate for $0< x \leq 0.1$.

Using it, we have $$\sigma_n\approx\sqrt{\log \left(\frac{2 e^2 n^2}{\pi }\right)-\log \left(\log \left(\frac{2 e^2 n^2}{\pi }\right)\right)}-\sqrt{\log \left(\frac{2 n^2}{\pi }\right)-\log \left(\log \left(\frac{2 n^2}{\pi }\right)\right)},$$ and the limit is $0$.
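To see the limit explicitly (a sketch based on the approximate formula above): multiply and divide by the conjugate, $$\sigma_n\approx\frac{a_n-b_n}{\sqrt{a_n}+\sqrt{b_n}},\qquad a_n=\log\frac{2e^2n^2}{\pi}-\log\log\frac{2e^2n^2}{\pi},\quad b_n=\log\frac{2n^2}{\pi}-\log\log\frac{2n^2}{\pi}.$$ The numerator is $$a_n-b_n=2-\log\frac{\log(2e^2n^2/\pi)}{\log(2n^2/\pi)}\longrightarrow 2,$$ while the denominator grows like $2\sqrt{2\log n}$, so $\sigma_n\to 0$ at the rate $\sigma_n\sim \dfrac{1}{\sqrt{2\log n}}$ (at $n=10^7$ this gives $\approx 0.176$, close to the tabulated $0.179$).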

Taking $n=10^k$, the table gives the approximate and exact values of $\sigma_n$ $$\left( \begin{array}{ccc} k & \text{approximation} & \text{exact} \\ 1 & 0.430284 & 0.443256 \\ 2 & 0.328501 & 0.328637 \\ 3 & 0.272163 & 0.271588 \\ 4 & 0.236707 & 0.236187 \\ 5 & 0.211987 & 0.211577 \\ 6 & 0.193551 & 0.193231 \\ 7 & 0.179145 & 0.178892 \\ 8 & 0.167499 & 0.167296 \\ 9 & 0.157836 & 0.157671 \\ 10 & 0.149656 & 0.149518 \\ 11 & 0.142615 & 0.142498 \\ 12 & 0.136474 & 0.136354 \\ 13 & 0.131056 & 0.131152 \\ 14 & 0.126231 & 0.126569 \\ 15 & 0.121898 & 0.133751 \end{array} \right)$$