Convergence in probability of $r_n$th order statistic to associated quantile.


Suppose $X_1,\dots,X_n$ is a random sample from a distribution $F$. Let $0<p<1$. Suppose that $q$ is such that $F(q-)<p<F(q)$. Show that $$P(X_{[r_n]}=q) \rightarrow1$$

if $$(r_n-np)n^{-1/2}\rightarrow0$$
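For what it's worth, the claimed convergence is easy to check numerically. Below is a sketch in Python; the mixture distribution with an atom at $q=1.5$ (so $F(q-)=0.2<p=0.5<F(q)=0.8$) and the choice $r_n=\mathrm{round}(np)$ are my own toy example, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy distribution with an atom at q = 1.5:
# F(q-) = 0.2 < p = 0.5 < F(q) = 0.8
def sample(n):
    u = rng.uniform(size=n)
    return np.where(u < 0.2, u / 0.2,              # Uniform(0,1) piece
           np.where(u < 0.8, 1.5,                  # atom at q = 1.5
                    2.0 + (u - 0.8) / 0.2))        # Uniform(2,3) piece

p, q = 0.5, 1.5
reps = 500
for n in [50, 500, 5000]:
    r_n = int(round(n * p))        # satisfies (r_n - n p) / sqrt(n) -> 0
    hits = 0
    for _ in range(reps):
        x = np.sort(sample(n))
        if x[r_n - 1] == q:        # r_n-th order statistic (1-indexed)
            hits += 1
    print(n, hits / reps)          # estimated P(X_{[r_n]} = q), close to 1
```

The estimated probability is essentially 1 already at moderate $n$, since the number of observations strictly below $q$ concentrates near $0.2n \ll r_n$ and the number at or below $q$ concentrates near $0.8n \gg r_n$.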

My intuition is that as the sample size grows, the $X$'s should accumulate at the jump point. However, I can't find any manipulation of $P(X_{[r_n]}=q)$ that avoids the unwieldy general CDF formulas for order statistics. A sketch or a starting point/strategy would be appreciated.

Attempt at solution (admittedly shaky): $$P(|X_{[r_n]}-q|<\epsilon)$$ $$= P(q-\epsilon<F^{-1}(U_{r_n})<q+\epsilon)$$ $$= P\big(n^{1/2}(F(q-\epsilon)-p)<n^{1/2}(U_{r_n}-p)<n^{1/2}(F(q+\epsilon)-p)\big)$$

As $n$ gets large, the middle term should be asymptotically normal and centered at 0, while the lower bound goes to $-\infty$ (since $F(q-\epsilon)\le F(q-)<p$) and the upper bound goes to $+\infty$ (since $F(q+\epsilon)\ge F(q)>p$), so this probability should go to one?
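The asymptotic-normality step in the display above can be checked numerically: with $r_n=\mathrm{round}(np)$ (my choice for the sketch, consistent with $(r_n-np)n^{-1/2}\to0$), the quantity $n^{1/2}(U_{r_n}-p)$ for uniform order statistics should be approximately $N(0,\,p(1-p))$:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.5, 4000, 2000
r_n = int(round(n * p))

# Simulate sqrt(n) * (U_(r_n) - p) for uniform order statistics
vals = np.array([
    np.sqrt(n) * (np.sort(rng.uniform(size=n))[r_n - 1] - p)
    for _ in range(reps)
])

# CLT for sample quantiles: mean ~ 0, std ~ sqrt(p(1-p)) = 0.5
print(vals.mean(), vals.std())
```

The empirical mean is near 0 and the standard deviation is near $\sqrt{p(1-p)}=0.5$, matching the usual CLT for uniform sample quantiles.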

I'm very unsure about replacing $X_{[r_n]}$ with $F^{-1}(U_{r_n})$; I cannot prove this identity, it is mere intuition.
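At least the distributional version of that replacement, $X_{(k)} \stackrel{d}{=} F^{-1}(U_{(k)})$ with the generalized inverse $F^{-1}(u)=\inf\{x:F(x)\ge u\}$, can be sanity-checked by simulation. Here is a sketch with a distribution that has atoms (the three-point distribution and the choice $n=9$, $k=5$ are my own toy example):

```python
import numpy as np

rng = np.random.default_rng(2)

# X uniform on {0, 1, 2}, so F(0) = 1/3, F(1) = 2/3, F(2) = 1.
def F_inv(u):
    # generalized inverse: F^{-1}(u) = inf{x : F(x) >= u}
    return np.where(u <= 1/3, 0, np.where(u <= 2/3, 1, 2))

n, k, reps = 9, 5, 20000

# k-th order statistic sampled directly...
direct = np.array([np.sort(rng.integers(0, 3, size=n))[k - 1]
                   for _ in range(reps)])
# ...versus F^{-1} applied to the k-th uniform order statistic
via_inv = np.array([F_inv(np.sort(rng.uniform(size=n))[k - 1])
                    for _ in range(reps)])

# The two estimates of P(X_(k) = 1) should agree up to simulation noise
print((direct == 1).mean(), (via_inv == 1).mean())
```

The agreement reflects that $F^{-1}$ is nondecreasing, so applying it to a sorted uniform sample preserves the ordering.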

**To anyone reading this in the future: this attempt is incomplete. It only shows $X_{[r_n]}\to q$ in probability, not the stronger claim that $P(X_{[r_n]}=q)\to1$.**