Proving the independence of $\frac{x_{(r)} - x_{(1)}}{x_{(n)} - x_{(1)}}$ and $(x_{(1)}, x_{(n)})$


Let $X_1, \ldots, X_n$ be i.i.d. random variables with $X_i \sim U[0, 1]$. I need to prove that $\xi = \frac{X_{(r)} - X_{(1)}}{X_{(n)} - X_{(1)}}$ is independent of $(X_{(1)}, X_{(n)})$ for every $r$ with $1 < r < n$, where $X_{(i)}$ denotes the $i$-th order statistic.

First of all I tried to compute $f_{\xi}(x \mid X_{(1)} = p, X_{(n)} = q)$ in order to show that this conditional distribution does not depend on $p, q$, but the result does not look very promising (at least I think so):

$$ f_{\xi}(x \mid X_{(1)} = p, X_{(n)} = q) = f_{\xi}(x) = f_{X_{(r)}}((q - p)x + p) = $$ $$ = \frac{\Gamma(n + 1)}{\Gamma(r)\Gamma(n + 1 - r)}(p + x(q - p))^{r - 1}(1 - p - x(q - p))^{n - r}, $$ since $X_{(r)} \sim B(r, n + 1 - r)$.
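The marginal fact used here, $X_{(r)} \sim B(r, n + 1 - r)$, is easy to sanity-check by simulation; a minimal sketch using NumPy (the constants $n = 10$, $r = 7$ match the simulation below, and $\mathbb{E}\,X_{(r)} = r/(n+1)$ for a Beta$(r, n+1-r)$ variable):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, N = 10, 7, 200_000

# r-th order statistic of each of N independent U[0,1] samples of size n
xr = np.sort(rng.random((N, n)), axis=1)[:, r - 1]

# X_(r) ~ Beta(r, n + 1 - r), whose mean is r / (n + 1) = 7/11 here
print(xr.mean())
```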

After that I presumed that any attempt to brute-force the solution would face similar computational difficulties, so I didn't really pursue the idea of computing $f_{(X_{(1)}, X_{(n)}, \xi)}(x, y, z)$ in order to show that it factors into two parts. Instead I tried numerical simulation and obtained a histogram of $\xi$ with $n = 10$, $r = 7$:

(histogram of the simulated $\xi$ for $n = 10$, $r = 7$ omitted)

which supports the intuitive idea that $\xi$ is distributed like the $(r-1)$-th order statistic of a sample of size $n - 2$ of i.i.d. $U[0,1]$ variables (really, it just sounds too good not to be true, so maybe this presumption should not be taken into account).
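That conjecture can be checked numerically as well: the $(r-1)$-th order statistic of $n-2$ uniforms is Beta$(r-1, n-r)$, so for $n = 10$, $r = 7$ the candidate law is Beta$(6, 3)$. A minimal Monte Carlo sketch (NumPy assumed) compares the first two moments:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 7                 # as in the simulation above
N = 200_000                  # Monte Carlo replications

x = np.sort(rng.random((N, n)), axis=1)   # each row: an ordered U[0,1] sample
xi = (x[:, r - 1] - x[:, 0]) / (x[:, -1] - x[:, 0])

# Conjectured law: Beta(r - 1, n - r) = Beta(6, 3),
# with mean 6/9 = 2/3 and variance 6*3 / (9**2 * 10) ≈ 0.0222.
print(xi.mean(), xi.var())
```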

After that I spent some more time trying to find a way to succeed, but, alas, that didn't happen, so I would really appreciate any help.


Best answer:

A computationally simple way to show this is to use Basu's theorem.

Suppose $X_1,X_2,\ldots,X_n$ are i.i.d. $\text{Uniform}(\theta-\sigma,\theta+\sigma)$ where $\theta\in \mathbb R$ and $\sigma\in \mathbb R^+$.

Then $(X_{(1)},X_{(n)})$ is a complete sufficient statistic for $(\theta,\sigma)$. Sufficiency follows from the factorization theorem; for completeness, see a similar question here.

Now note that $\frac{X_i-\theta}{\sigma}\sim \text{Uniform}(-1,1)$ for every $i=1,\ldots,n$, so that its distribution is free of $(\theta,\sigma)$. Hence the distribution of $\frac{X_{(r)}-\theta}{\sigma}$ is also free of $(\theta,\sigma)$ for every $r=1,\ldots,n$.

As a result, the distribution of $\frac{X_{(r)}-X_{(1)}}{X_{(n)}-X_{(1)}}$ is also free of $(\theta,\sigma)$, since $\theta$ and $\sigma$ cancel:

$$\frac{X_{(r)}-X_{(1)}}{X_{(n)}-X_{(1)}}=\frac{(X_{(r)}-\theta)/\sigma - (X_{(1)}-\theta)/\sigma}{(X_{(n)}-\theta)/\sigma-(X_{(1)}-\theta)/\sigma}$$

In other words $\frac{X_{(r)}-X_{(1)}}{X_{(n)}-X_{(1)}}$ is an ancillary statistic for $(\theta,\sigma)$, whence it is independent of the complete sufficient statistic $(X_{(1)},X_{(n)})$ by Basu's theorem.

This is true for every $(\theta,\sigma)$, so in particular for $\theta=\sigma=\frac12$, which is exactly the $U[0,1]$ case of the question.
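As an empirical sanity check of the conclusion, the sample correlations between $\xi$ and functions of $(X_{(1)}, X_{(n)})$ should vanish; a minimal sketch (NumPy assumed) uses $X_{(1)}$ and the range $X_{(n)} - X_{(1)}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, N = 10, 7, 200_000

x = np.sort(rng.random((N, n)), axis=1)   # ordered U[0,1] samples
xi = (x[:, r - 1] - x[:, 0]) / (x[:, -1] - x[:, 0])

# By Basu's theorem xi is independent of (X_(1), X_(n)), so these sample
# correlations should be close to 0 (up to Monte Carlo noise ~ 1/sqrt(N)).
c1 = np.corrcoef(xi, x[:, 0])[0, 1]
c2 = np.corrcoef(xi, x[:, -1] - x[:, 0])[0, 1]
print(c1, c2)
```

Zero correlation is of course weaker than independence, but a nonzero value here would have refuted the claim.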