Finding the marginal distribution of $X_{(1)}/X_{(n)}$ when $X_i \overset{\text{iid}}{\sim} \operatorname{Unif}(0,\theta)$


I am doing a practice problem in preparation for an exam, but I have hit a dead end.

Let $X_1, \dots, X_n$ be i.i.d. observations from the Unif$(0,\theta)$ distribution. Use the joint distribution of $(X_{(1)}, X_{(n)})$ to show that $V=\frac{X_{(1)}}{X_{(n)}}$ is independent of $W=X_{(n)}$.

Let $Y_i=X_{(i)}$. For the joint p.d.f. of $Y_1, Y_n$ I get $$ f_{Y_1, Y_n}(y_1, y_n)=n^3(n-1)\frac{y_n^{n-1}}{\theta^{n+1}}(1-\frac{y_1}{\theta})^{n-1}\bigg[\frac{y_n^{n}}{\theta^{n}}-1 +(1-\frac{y_1}{\theta})^n\bigg]^{n-2}. $$

From here I tried to find $f_{V,W}(v,w)$ and show it factors into $f_V(v)f_W(w)$. The latter p.d.f. is easy to find, but $f_V(v)$ is not. I tried to use the fact that $$ f_V(v)=\int_{\mathcal{W}}f_{V,W}(v,w) \, dw=\int_0^{\theta}n^3(n-1) \frac{w^{n-1}}{\theta^{n+1}}(1-\frac{vw}{\theta})^{n-1}\bigg[\frac{w^{n}}{\theta^{n}}-1 +(1-\frac{vw}{\theta})^n\bigg]^{n-2} \, dw, $$ but I can't figure out how to integrate this. I was thinking a clever substitution would let me lean on the Beta function, but if it exists, I can't find it. Is there a way to circumvent the need to find $f_V(v)$? (I can't use Basu's theorem for this.)


Best answer:

Easy answer:

  1. $X\sim U(0;\theta)$ belongs to a scale family

  2. $W=X_{(n)}$ is a complete and sufficient statistic (CSS) for $\theta$

  3. $V=\frac{X_{(1)}}{X_{(n)}}$ is scale invariant, hence ancillary for $\theta$

Now apply Basu's theorem to conclude that $V\perp\!\!\!\perp W$.
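As a quick numerical sanity check (my addition, not part of the proof; the sample size, seed, and the particular product event below are arbitrary choices), one can verify by simulation that a product event factors the way independence requires:

```python
import random

random.seed(0)
n, theta, N = 5, 2.0, 200_000

V, W = [], []
for _ in range(N):
    xs = [random.uniform(0, theta) for _ in range(n)]
    w = max(xs)            # W = X_(n)
    V.append(min(xs) / w)  # V = X_(1) / X_(n)
    W.append(w)

# Independence implies P(V <= 0.5, W <= 1.5) = P(V <= 0.5) * P(W <= 1.5).
pv = sum(v <= 0.5 for v in V) / N
pw = sum(w <= 1.5 for w in W) / N
pvw = sum(v <= 0.5 and w <= 1.5 for v, w in zip(V, W)) / N
print(abs(pvw - pv * pw))  # small, consistent with independence
```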


If you want to use the hint of the exercise, a further easy way to proceed is the following.

To simplify the notation, let $x<y$ denote the minimum and the maximum of the $n$ observations; thus

$$f_{XY}(x,y)=n(n-1)\frac{(y-x)^{n-2}}{\theta^n}\cdot\mathbb{1}_{(0<x<y<\theta)}$$

Now set the following system

$$ \left\{ \begin{array}{c} v=\frac{x}{y} \\ w=y \end{array} \right. \rightarrow \left\{ \begin{array}{c} x=vw\\ y=w \end{array} \right. $$

Since the Jacobian is $|J|=w$, you get

$$f_{VW}(v,w)=n(n-1)\frac{[w(1-v)]^{n-2}}{\theta^n}\cdot w=$$

$$=\frac{n(n-1)}{\theta^n}w^{n-1}(1-v)^{n-2}=$$

$$=\frac{nw^{n-1}}{\theta^n}\cdot\mathbb{1}_{(0;\theta)}(w)\times (n-1)(1-v)^{n-2}\cdot \mathbb{1}_{(0;1)}(v)=f_W(w)\cdot f_V(v)$$

which is exactly the definition of independence: the joint density factors into the product of the marginals.
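A small simulation (my addition; the parameters are arbitrary) agrees with these marginals: $f_V(v)=(n-1)(1-v)^{n-2}$ on $(0,1)$ is the Beta$(1,n-1)$ density, so $\mathbb E[V]=1/n$, while $f_W(w)=nw^{n-1}/\theta^n$ gives $\mathbb E[W]=n\theta/(n+1)$:

```python
import random

random.seed(1)
n, theta, N = 4, 3.0, 200_000

sum_v = sum_w = 0.0
for _ in range(N):
    xs = [random.uniform(0, theta) for _ in range(n)]
    w = max(xs)
    sum_v += min(xs) / w
    sum_w += w

mean_v, mean_w = sum_v / N, sum_w / N
# f_V(v) = (n-1)(1-v)^(n-2) on (0,1) is Beta(1, n-1): E[V] = 1/n.
# f_W(w) = n w^(n-1)/theta^n on (0,theta):            E[W] = n*theta/(n+1).
print(mean_v, 1 / n)
print(mean_w, n * theta / (n + 1))
```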

Another answer:

I have no idea how you derived that supposed joint order-statistic probability density function.

For $\mathcal U(0..\theta)$ random variables we have the probability density function, $f(x)=\theta^{-1}\mathbf 1_{0\leqslant x\leqslant\theta}$, and cumulative distribution function, $F(x)=x\theta^{-1}\mathbf 1_{0\leqslant x\lt\theta}+\mathbf 1_{\theta\leqslant x}$.

So the joint probability density function of the smallest and largest order statistics of $n$ independent random variables distributed as above is:

$$\begin{align}f_{Y_1,Y_n}(y,z) &= \dfrac{n!}{1!~(n-2)!~1!} f(y)\,(F(z)-F(y))^{n-2}\, f(z) \\[1ex] & = n(n-1)(z-y)^{n-2}\theta^{-n}\mathbf 1_{0\leqslant y\lt z\leqslant \theta} \end{align}$$

This is: the number of ways to choose which of the $n$ samples are the smallest and the largest, times the density of those two samples taking the values $y$ and $z$, times the probability that the remaining $n-2$ samples lie between those values.


Alternatively, derive it from the joint CDF: the probability that all samples are at most $z$, minus the probability that all samples lie in $(y,z]$. $$\begin{align}f_{Y_1,Y_n}(y,z) &= \dfrac{\partial^2}{\partial y\,\partial z}\left(\mathsf P({\bigcap}_{j=1}^n \{X_j\leqslant z\})-\mathsf P({\bigcap}_{j=1}^n \{y\lt X_j\leqslant z\})\right)\\[1ex] &= \dfrac{\partial^2}{\partial y\,\partial z}\left(F(z)^n-(F(z)-F(y))^n\right)\end{align}$$
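One can check this CDF expression by simulation (my own sketch; $n$, $\theta$, and the evaluation point are arbitrary):

```python
import random

random.seed(2)
n, theta, N = 3, 1.0, 200_000
a, b = 0.3, 0.8  # evaluate P(Y_1 <= a, Y_n <= b)

hits = 0
for _ in range(N):
    xs = [random.uniform(0, theta) for _ in range(n)]
    if min(xs) <= a and max(xs) <= b:
        hits += 1

emp = hits / N
F = lambda x: min(max(x / theta, 0.0), 1.0)  # Unif(0, theta) CDF
exact = F(b) ** n - (F(b) - F(a)) ** n       # = 0.8**3 - 0.5**3 = 0.387
print(emp, exact)
```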


Then, noting that surely $0\leqslant Y_1/Y_n\leqslant 1$, you should use the Jacobian transformation: $$\begin{align}f_{Y_1/Y_n, Y_n}(v,w)&=\left\lvert\dfrac{\partial (vw)}{\partial v}\right\rvert f_{Y_1,Y_n}(vw,w)\\[2ex]f_{Y_1/Y_n}(v) &=\mathbf 1_{0\leqslant v\leqslant 1} \int_\Bbb R \left\lvert\dfrac{\partial (vw)}{\partial v}\right\rvert f_{Y_1,Y_n}(vw,w)\,\mathrm d w\\[1ex]&~~\vdots\end{align}$$
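Filling in the dots (my completion, using the joint density derived above): on $0\leqslant v\leqslant 1$ the factor $(w-vw)^{n-2}=w^{n-2}(1-v)^{n-2}$ turns the integrand into a pure power of $w$,

$$\begin{align}f_{Y_1/Y_n}(v)&=\mathbf 1_{0\leqslant v\leqslant 1}\int_0^\theta w\cdot n(n-1)\,w^{n-2}(1-v)^{n-2}\,\theta^{-n}\,\mathrm d w\\[1ex]&=n(n-1)(1-v)^{n-2}\,\theta^{-n}\cdot\frac{\theta^n}{n}\\[1ex]&=(n-1)(1-v)^{n-2}\,\mathbf 1_{0\leqslant v\leqslant 1},\end{align}$$

which involves neither $w$ nor $\theta$, so $V=Y_1/Y_n$ is independent of $W=Y_n$ (and ancillary for $\theta$), matching the other answer.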