Find the conditional density of $X_1$ given that it is not the smallest of $n$ independent uniform $(0,1)$ random variables


Let $X_1,...,X_n$ be independent uniform $(0, 1)$ random variables. Find the conditional density of $X_1$ given that it is not the smallest of the $n$ values.

Here is my idea:

Let $Y$ denote the event that $X_1$ is not the smallest value.

By intuition, $F_{X_1|Y}(x_1)=\frac{F_{X_1}(x_1)\cdot \mathbb P(Y)}{\mathbb P(Y)}=F_{X_1}(x_1)=\int_{0}^{x_1}1 \,\mathrm{d}t=x_1$

But I wonder whether $Y$ and $X_1$ are really independent, and how to prove this rigorously.
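One quick way to probe this intuition is numerically. The sketch below (added here; it is not part of the original post) estimates the conditional mean of $X_1$ given that it is not the minimum, for $n=3$. If $X_1$ were independent of that event, the conditional mean would stay at $1/2$; integrating $t$ against the density derived in the accepted answer instead gives $(n+2)/(2(n+1)) = 5/8$ for $n=3$.

```python
import random

# Monte Carlo sketch (assumption: n = 3 i.i.d. uniform(0,1) variables).
# If X_1 were independent of the event "X_1 is not the smallest",
# E[X_1 | not smallest] would equal 1/2.  It does not.
random.seed(0)
n, trials = 3, 200_000
kept = []
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    if xs[0] != min(xs):              # condition on X_1 not being the minimum
        kept.append(xs[0])
cond_mean = sum(kept) / len(kept)     # should be near 5/8 = 0.625, not 0.5
```

Since the conditional mean shifts away from $1/2$, $X_1$ and the event $Y$ cannot be independent.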

2 Answers

BEST ANSWER

Let $Y=X_{(1)}$ be the minimum of the $X_i$. For $t\in(0,1)$ we have \begin{align} \mathbb P(Y\leqslant t) &= 1-\mathbb P(Y > t)\\ &= 1 - \mathbb P\left(\bigcap_{i=1}^n \{X_i>t\} \right)\\ &= 1 - \prod_{i=1}^n\mathbb P(X_i>t)\\ &= 1- \mathbb P(X_1>t)^n\\ &= 1-(1-t)^n, \end{align} and so the density of $Y$ is $f_Y(t)=n(1-t)^{n-1}\mathsf 1_{(0,1)}(t)$.
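The CDF $\mathbb P(Y\leqslant t)=1-(1-t)^n$ derived above is easy to check empirically. This simulation sketch is my addition, not the answer's, with $n=4$ and $t=0.3$ chosen arbitrarily:

```python
import random

# Sanity check (my addition): the empirical CDF of the minimum of n
# i.i.d. uniform(0,1) variables should match 1 - (1 - t)^n.
random.seed(1)
n, trials, t = 4, 100_000, 0.3
hits = sum(min(random.random() for _ in range(n)) <= t
           for _ in range(trials))
empirical = hits / trials
exact = 1 - (1 - t) ** n              # = 1 - 0.7**4 = 0.7599
```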

Let $E=\{X_1=Y\}$; by symmetry $\mathbb P(E)=\frac1n$, and since $X_1=Y$ on $E$, the distribution of $X_1$ conditioned on $E$ is the same as the distribution of $Y$. By the law of total probability, we have for $t\in(0,1)$ $$ 1 = f_{X_1}(t) =f_{X_1}(t\mid E)\mathbb P(E) + f_{X_1}(t\mid E^c)\mathbb P(E^c) = f_{X_1}(t\mid E)\frac 1n +f_{X_1}(t\mid E^c)\frac{n-1}n. $$ It follows that $$ f_{X_1}(t\mid E^c) = \left(\frac n{n-1}\right)\left(1 - f_{X_1}(t\mid E)\frac1n \right)=\left(\frac n{n-1}\right) (1-(1-t)^{n-1})\mathsf 1_{(0,1)}(t). $$
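Integrating the density above gives the conditional CDF $F_{X_1}(x\mid E^c) = \frac{nx - 1 + (1-x)^n}{n-1}$, which can be compared against simulation. The check below is my own sketch (with $n=3$, $x=0.5$ as arbitrary test values), not part of the answer:

```python
import random

# Monte Carlo check (my addition) of the derived conditional CDF
# F(x | E^c) = (n*x - 1 + (1 - x)**n) / (n - 1).
random.seed(2)
n, trials, x = 3, 200_000, 0.5
kept = []
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    if xs[0] != min(xs):              # keep samples where X_1 is not the minimum
        kept.append(xs[0])
empirical = sum(v <= x for v in kept) / len(kept)
exact = (n * x - 1 + (1 - x) ** n) / (n - 1)   # = 0.3125 for n=3, x=0.5
```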

SECOND ANSWER

Let $Y=X_{(1)}$ denote the minimum of the $X_i$, and let $E=\{X_1=Y\}$ denote the event that $X_1$ is the minimum among the $X_i$.

I am going to prove that $\mathbb P (E)=\frac {1}{n}$; by symmetry the same probability holds for every $X_i$.

Let $Z=\min(X_2,X_3,\dots,X_n)$ denote the minimum of the remaining $n-1$ random variables.

For $t\in(0,1)$, we have:

\begin{align} \mathbb P(Z\leqslant t) &= 1-(1-t)^{n-1} \end{align}

So the density is $f_Z(t)=(n-1)(1-t)^{n-2}\mathsf 1_{(0,1)}(t)$.

We have: \begin{align} \mathbb P(X_1\text{ is the minimum})&=\mathbb P(X_1\le Z)\\ &=\int_{0}^{1}\mathbb P(X_1\le t)\, f_Z(t)\, \mathrm{d}t\\ &=\int_{0}^{1}t\,(n-1)(1-t)^{n-2}\,\mathrm{d}t\\ &=\frac1n \end{align}

Similarly, $\mathbb P(X_i\text{ is the minimum})=\frac 1n$ holds for all $i=1,2,\dots,n$.

Now I am going to show that $\mathbb P(X_i\text{ is the second minimum})=\frac 1n$ for $i=1,2,\dots,n$.

Take $X_1$ as an example. Let $S$ denote the event that $X_1$ is the second minimum, and for $i=2,3,\dots,n$ let $E_i$ denote the event that $X_i$ is the minimum. Notice that:

\begin{align} \mathbb P(S)&=\mathbb P(S|E_2)\mathbb P(E_2)+\mathbb P(S|E_3)\mathbb P(E_3)+\cdots+\mathbb P(S|E_n)\mathbb P(E_n)\\ &=\sum_{i=2}^{n}\mathbb P(S|E_i)\mathbb P(E_i) \end{align}

And we have: \begin{align} &\mathbb P(S|E_i)= \mathbb P(X_1\text{ is the minimum among }X_1,\dots,X_{i-1},X_{i+1},\dots,X_n)=\frac {1}{n-1}\\ &\mathbb P(E_i)=\frac 1n \end{align}

So we have: \begin{align} \mathbb P(S)&=\sum_{i=2}^{n}\mathbb P(S|E_i)\mathbb P(E_i)\\ &=\sum_{i=2}^{n}\frac {1}{n-1}\cdot \frac1n\\ &=\frac 1n \end{align}

So $\mathbb P(S)=\frac 1n$, which shows that $\mathbb P(X_i\text{ is the second minimum})=\frac 1n$ for $i=1,2,\dots,n$.

Similarly, we have:

\begin{align} \mathbb P(X_i\text{ is the third minimum})=\cdots=\mathbb P(X_i\text{ is the }n\text{th minimum})=\frac 1n,\quad i=1,2,\dots,n \end{align}

That is: \begin{align} \mathbb P(X_i\text{ is the }k\text{th minimum})=\frac 1n,\quad i=1,2,\dots,n,~k=1,2,\dots,n \end{align}
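This uniform-rank conclusion can also be checked by simulation. The sketch below is my addition (with $n=4$ chosen arbitrarily): it tabulates the rank of $X_1$ over many trials and verifies each rank occurs with frequency close to $1/n$.

```python
import random
from collections import Counter

# Simulation (my addition): the rank of X_1 among n i.i.d. uniform(0,1)
# variables should itself be uniform on {1, ..., n}.
random.seed(4)
n, trials = 4, 200_000
counts = Counter()
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    rank = 1 + sum(v < xs[0] for v in xs)   # 1 = minimum, n = maximum
    counts[rank] += 1
freqs = {k: counts[k] / trials for k in sorted(counts)}
# each frequency should be near 1/n = 0.25
```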