Inequality between Expectation and Quantiles


For a sample of independent observations $X_1,X_2,...,X_n$ on a continuous distribution $F$, let the ordered sample values be $X_{(1)},X_{(2)},...,X_{(n)}$. From the theory of order statistics, the density function $g(x)$ of the maximum variable $X_{(n)}$ is known to be $$g(x)=n\,f(x)\,[F(x)]^{n-1}, $$ with $f(x)=F'(x)$.

On the other hand, let $q_\alpha=F^{-1}(\alpha)$ be the $\alpha$-quantile of $F$, for $1/2<\alpha<1$.

Is it possible to prove the following inequality

$$q_\alpha\leq\int_{-\infty}^\infty\,x\,g(x)\,dx\quad?$$

BEST ANSWER

Take a continuous $F$ such that $F(-B)=0.3$, $F(0)=0.5$ and $F(1)=1$ for some $B>0$.

Then $q_\alpha\ge 0$ for every $\alpha \ge 0.5$.

On the other hand, $E=\int_{-\infty}^\infty\,x\,g(x)\,dx \le -B\,(0.3)^n+1$: with probability $(0.3)^n$ the entire sample falls below $-B$, so $X_{(n)}\le -B$ on that event, and otherwise $X_{(n)}\le 1$.

Choosing $B\gt\frac{1}{(0.3)^n}$ therefore gives $E \lt 0 \le q_\alpha$.

So the inequality is not always true.
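The counterexample can be checked numerically. The sketch below picks one concrete (hypothetical) $F$ of this type: a piecewise-uniform distribution with $F(-B-1)=0$, $F(-B)=0.3$, $F(0)=0.5$, $F(1)=1$, with $n=5$ and $B=500 > 1/(0.3)^5$ chosen as assumptions, and estimates $E[X_{(n)}]$ by Monte Carlo.

```python
import numpy as np

B = 500.0  # chosen so that B > 1 / 0.3**n for n = 5 (0.3**5 ~ 0.00243)
n = 5      # sample size

def F_inv(u):
    """Quantile function of a piecewise-uniform F with
    F(-B-1)=0, F(-B)=0.3, F(0)=0.5, F(1)=1."""
    u = np.asarray(u, dtype=float)
    x = np.empty_like(u)
    lo = u < 0.3
    mid = (u >= 0.3) & (u < 0.5)
    hi = u >= 0.5
    x[lo] = -B - 1 + u[lo] / 0.3            # mass 0.3 spread over [-B-1, -B]
    x[mid] = -B + (u[mid] - 0.3) / 0.2 * B  # mass 0.2 spread over [-B, 0]
    x[hi] = (u[hi] - 0.5) / 0.5             # mass 0.5 spread over [0, 1]
    return x

rng = np.random.default_rng(0)
U = rng.random((200_000, n))
# F_inv is monotone, so the sample maximum equals F_inv(max of the uniforms).
max_X = F_inv(U.max(axis=1))

q_half = F_inv(np.array([0.5]))[0]  # q_alpha = 0 at alpha = 0.5
print(f"q_0.5 = {q_half},  estimated E[X_(n)] = {max_X.mean():.2f}")
```

The estimated expectation comes out clearly negative while $q_{0.5}=0$, confirming that $q_\alpha \le E[X_{(n)}]$ fails for this $F$.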