Prove that the regression beta of paired order statistics converges to 1?


I was just working on a toy problem and was curious how to prove it. Suppose $(X_1, \dots , X_n)$ and $(Y_1, \dots , Y_n)$ are independent vectors of i.i.d. $N(0,1)$ random variables. If we plot the points $(X_1,Y_1), \dots, (X_n, Y_n)$, we just see a cloud around $0$ with correlation near $0$. If instead we sort each sample first and pair up the order statistics, we get $(X_{(1)}, Y_{(1)}), \dots, (X_{(n)},Y_{(n)})$, and these will clearly have correlation near $1$. How can I prove this convergence? Writing out the formula for the sample correlation, I find that it converges to the same limit as $$ \frac{1}{n} \sum_{i=1}^n X_{(i)}Y_{(i)}.$$
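To make that reduction explicit (writing $\bar X_n$, $\bar Y_n$ for the sample means and $\hat\sigma_{X,n}$, $\hat\sigma_{Y,n}$ for the sample standard deviations, which sorting leaves unchanged), the sample correlation of the paired order statistics is
$$ \hat\rho_n = \frac{\frac{1}{n}\sum_{i=1}^n X_{(i)}Y_{(i)} - \bar X_n \bar Y_n}{\hat\sigma_{X,n}\,\hat\sigma_{Y,n}}, $$
and since $\bar X_n, \bar Y_n \to 0$ and $\hat\sigma_{X,n}, \hat\sigma_{Y,n} \to 1$ a.s. by the strong law of large numbers, $\hat\rho_n$ has the same a.s. limit (if one exists) as the cross term above.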

But proving convergence of this average is difficult, since the summands are not i.i.d. and the distribution of $X_{(i)}$ changes with $n$. Also, what type of convergence is it? Can I get a.s. convergence from this? I assume so, since it is an LLN-type result, but I still cannot prove it.
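For what it's worth, here is a quick numerical check (a minimal sketch using numpy, not part of any proof) that the sample correlation of the paired order statistics does approach $1$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

for n in (10, 100, 1000, 10000):
    x = np.sort(rng.standard_normal(n))  # order statistics of the X sample
    y = np.sort(rng.standard_normal(n))  # order statistics of an independent Y sample
    r = np.corrcoef(x, y)[0, 1]          # sample correlation of the paired order statistics
    print(n, r)
```

The printed correlations get closer to $1$ as $n$ increases, which is the behavior I would like to prove rigorously.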