I am having trouble understanding the concept of order statistics. If I have $X_1, \ldots, X_n$ random variables, I can define the order statistics (which are also random variables) $X_{(1)}, \ldots, X_{(n)}$, which are a sorting of $X_1, \ldots, X_n$.
I can't sort random variables; instead I have to sort their realizations. But once I sort these realizations, they are no longer functions — each is now a fixed value (a realization). So I don't understand why $X_{(1)}, \ldots, X_{(n)}$ are random variables, since they are a sorting of realizations. Can someone help me?
Concept of order statistics
156 Views. Asked by user91684 (https://math.techqa.club/user/user91684/detail). There are 5 solutions below.
If you do not sort the rv's, they are not order statistics.
Consider the following example.
Let $X,Y$ be iid uniform $U(0,1)$. Their realizations can take any value in $(0,1)$ independently, but once the realizations are observed you can always sort them. As they stand, $X$ and $Y$ are not order statistics. By independence, their covariance is $\operatorname{Cov}(X,Y)=0$.
Now let $X,Y$ again be $U(0,1)$ rv's, but let $X$ be the minimum of the two and $Y$ the maximum. Now they are ordered, so they are order statistics. Their covariance is no longer zero but $\frac{1}{36}$.
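The covariance claims above can be checked numerically. This is a minimal Monte Carlo sketch (variable names are my own, not from the answer): draw many iid uniform pairs, then compare the covariance of the raw pair with the covariance of the sorted pair.

```python
import numpy as np

rng = np.random.default_rng(0)

# One million realizations of the iid pair (X, Y), each U(0, 1)
pairs = rng.uniform(0.0, 1.0, size=(1_000_000, 2))

lo = pairs.min(axis=1)  # X_(1): the minimum of each pair
hi = pairs.max(axis=1)  # X_(2): the maximum of each pair

cov_unsorted = np.cov(pairs[:, 0], pairs[:, 1])[0, 1]  # should be near 0
cov_sorted = np.cov(lo, hi)[0, 1]                      # should be near 1/36

print(cov_unsorted, cov_sorted)
```

With this many samples, the sorted-pair covariance lands close to $1/36 \approx 0.0278$, while the unsorted pair stays near zero.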
They are the composition of measurable functions with the random vector $X=(X_{1}, \ldots X_{n})$, so they are random variables (since composition of measurable function with random variable is a random variable).
Take for example the case $n=2$. Then we have two random variables $X_1,X_2$, so $X=(X_1,X_2)$ is a random vector. In this case your $X_{(1)}$ would be $X_{(1)}=\min(X)$ and your $X_{(2)}=\max(X)$. The functions $g=\min$ and $h=\max$ are measurable, so $X_{(1)}$ and $X_{(2)}$ are random variables.
For a generic $n$ you would consider $n$ different measurable functions $g_i$, where $g_i$ returns the $i$-th smallest component of $X=(X_{1}, \ldots, X_{n})$.
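The "composition with a measurable function" view can be made concrete in code. Below is a small sketch (the helper name `order_statistic` is mine): the function $g_i$ acts on a whole vector and returns its $i$-th smallest component, so applying it to a realization of the random vector $X$ gives a realization of $X_{(i)}$.

```python
import numpy as np

def order_statistic(x, i):
    """Return the i-th smallest component (1-indexed) of the vector x,
    i.e. the function g_i applied to a realization of X."""
    return np.sort(x)[i - 1]

rng = np.random.default_rng(1)
x = rng.normal(size=5)  # one realization of X = (X_1, ..., X_5)

print(order_statistic(x, 1), order_statistic(x, 5))
```

For $i=1$ and $i=n$ this reduces to $\min$ and $\max$, exactly the $n=2$ case above.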
How do you define the function "minimum of two rv's"? That's what I don't get.
Let $X,Y$ be two independent rv's
Let's define $U=\min(X,Y)$
To characterize a rv it is enough to know its CDF, so by definition
$$P(U>u)=P(X>u,Y>u)=P(X>u)P(Y>u)=[1-F_X(u)][1-F_Y(u)],$$
where the second equality uses independence.
Thus
$$F_U(u)=1-[1-F_X(u)][1-F_Y(u)]$$
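This CDF formula is easy to verify by simulation. A sketch assuming both variables are $U(0,1)$, so that $F_X(t)=F_Y(t)=t$ on $[0,1]$ and the formula gives $F_U(t)=1-(1-t)^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

x = rng.uniform(size=n)
y = rng.uniform(size=n)
u = np.minimum(x, y)  # realizations of U = min(X, Y)

for t in (0.25, 0.5, 0.75):
    empirical = (u <= t).mean()      # empirical CDF of U at t
    theoretical = 1 - (1 - t) ** 2   # F_U(t) from the formula above
    print(t, empirical, theoretical)
```

The empirical CDF matches $1-(1-t)^2$ to within sampling error at each checkpoint.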
The minimum of two random variables is a function of those random variables, and so is a random variable itself. It is a similar concept to the sum of two random variables also being a random variable.
For example, if $X_1$ and $X_2$ are i.i.d. uniform on $[0,1]$, then the minimum $X_{(1)}$ has a distribution on $[0,1]$ but is more likely to be low than high. Its density on this interval turns out to be $f_{\min}(x)=2-2x$, while the maximum $X_{(2)}$ has density $f_{\max}(x)=2x$; the two are not independent and in this example have a correlation of $+0.5$.
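The stated densities and correlation can also be checked by simulation (a sketch; names are mine). The densities $2-2x$ and $2x$ imply $E[X_{(1)}]=1/3$ and $E[X_{(2)}]=2/3$, which gives a quick consistency check alongside the correlation:

```python
import numpy as np

rng = np.random.default_rng(3)
pairs = rng.uniform(size=(1_000_000, 2))

mn = pairs.min(axis=1)  # X_(1)
mx = pairs.max(axis=1)  # X_(2)

corr = np.corrcoef(mn, mx)[0, 1]  # claimed to be +0.5
# E[X_(1)] = 1/3 under density 2 - 2x; E[X_(2)] = 2/3 under density 2x
print(corr, mn.mean(), mx.mean())
```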
As I see it, your problem is just with the definition.
Note that $X_1,...,X_n:\Omega \to \mathbb R$ are random variables (that is, measurable functions).
For given $\omega \in \Omega$, you have certain values $X_1(\omega),...,X_n(\omega)$.
Those certain values can be sorted (though the sorting permutation may not be unique when there are ties). In other words, there exists a permutation $\pi_{\omega}:\{1,...,n\} \to \{1,...,n\}$ such that:
$$ X_{\pi_{\omega}(1)}(\omega) \le ... \le X_{\pi_{\omega}(n)}(\omega)$$
Now, the point is, you define $X_{(k)}:\Omega \to \mathbb R$ with the formula: $$ X_{(k)}(\omega) = X_{\pi_{\omega}(k)}(\omega)$$
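This pointwise construction translates directly into code (a sketch; names are mine): for each outcome $\omega$, `argsort` produces the permutation $\pi_{\omega}$, and applying it to that outcome's values yields $X_{(1)}(\omega) \le \ldots \le X_{(n)}(\omega)$.

```python
import numpy as np

rng = np.random.default_rng(4)

# Rows play the role of outcomes ω; columns are X_1(ω), ..., X_6(ω)
samples = rng.normal(size=(4, 6))

# π_ω for each outcome: the permutation that sorts that row (0-indexed)
pi = np.argsort(samples, axis=1)

# X_(k)(ω) = X_{π_ω(k)}(ω), applied row by row
ordered = np.take_along_axis(samples, pi, axis=1)

print(ordered)
```

Each row of `ordered` is nondecreasing, and the whole operation is just composing the original random variables with a (row-dependent) permutation, which is why each $X_{(k)}$ is again a function of $\omega$.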