Conditional expected value of multiple draws from a uniform distribution


There are $m$ i.i.d. draws of $x$ made from a uniform distribution on $[0,1]$. The $n$ ($n\leq m$) lowest draws are "winners": writing the ordered draws as $x_1\leq\ldots\leq x_n\leq\ldots\leq x_m$, the draws $x_1$ through $x_n$ are the "winning draws".

Now player/draw $i$ learns his value $x_i$ and the fact that he is a winner, i.e. $i\in\{1,\ldots,n\}$. What is the expected value of the remaining $n-1$ winning draws, given $i$'s knowledge of his own draw and of the fact that he is a winner (but not of his "rank"/position among the winners)?
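The quantity being asked for can be estimated directly by simulation, which is useful as a ground truth for checking any closed-form candidate. A minimal Monte Carlo sketch (the function name is hypothetical, and it assumes $n\geq 2$): fix $x_i$, draw the other $m-1$ values, keep only the trials in which at most $n-1$ of them fall below $x_i$ (so that $i$ is a winner), and average the $n-1$ smallest of the remaining draws.

```python
import numpy as np

def mc_expected_other_winners(x_i, m, n, trials=100_000, seed=0):
    """Monte Carlo estimate of E[mean of the other n-1 winning draws
    | player i drew x_i and i is a winner]. Assumes n >= 2."""
    rng = np.random.default_rng(seed)
    others = np.sort(rng.random((trials, m - 1)), axis=1)
    # i is a winner iff at most n-1 of the other m-1 draws lie below x_i
    winner = (others < x_i).sum(axis=1) <= n - 1
    # given that, the other n-1 winners are the n-1 smallest remaining draws
    return others[winner, : n - 1].mean()
```

As a sanity check, when $n=m$ everyone is a winner and the other winners are simply all $m-1$ remaining draws, so the estimate should be close to $1/2$ regardless of $x_i$.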


EDIT: With the help of the comments below, this is what I have managed to do so far (credit to the commenters!):

  • The unconditional expected value of the winning draws is $\frac{1}{n}\sum_{i=1}^n \frac{i}{1+m}=\frac{n+1}{2(m+1)}$.
  • Obviously, the unconditional cdf is given by $F(x)=x$ for $x\in[0,1]$.
  • $i$ can be the 1st, 2nd, …, $n$th of the winners. For each possible rank, compute the expected value of the remaining $n-1$ winners (distributed below/above him, depending on $i$'s rank) and weight it by the probability of that rank.
  • The expected value of the remaining winners, conditional on $x_i$ and the knowledge that $i$ is a "winner", can hence be computed by (incomplete and possibly wrong!):

\begin{alignat*}{3}
\frac{1}{n-1}\bigg(%
  &(1-x_i)^{m-1} &\cdot\binom{m-1}{0} &\Big[(1-1)\frac{x_i}{2} &+\sum_{k=1}^{n-1}\big(x_i+(1-x_i)E[x_{(k)}^{m-1}]\big)\Big]\\
+ &(1-x_i)^{m-2}x_i^{1} &\cdot\binom{m-1}{1} &\Big[(2-1)\frac{x_i}{2} &+\sum_{k=1}^{n-2}\big(x_i+(1-x_i)E[x_{(k)}^{m-2}]\big)\Big]\\
+ &\ldots\\
+ &(1-x_i)^{m-j}x_i^{j-1} &\cdot\binom{m-1}{j-1} &\Big[(j-1)\frac{x_i}{2} &+\sum_{k=1}^{n-j}\big(x_i+(1-x_i)E[x_{(k)}^{m-j}]\big)\Big]\\
+ &\ldots\\
+ &(1-x_i)^{m-n}x_i^{n-1} &\cdot\binom{m-1}{n-1} &\Big[(n-1)\frac{x_i}{2} &+\sum_{k=1}^{n-n}\big(x_i+(1-x_i)E[x_{(k)}^{m-n}]\big)\Big]%
\bigg)
\end{alignat*}
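To make the sum above checkable against simulation, here is a direct numerical transcription (the function name is hypothetical). It reads the rank-$j$ weight as $\binom{m-1}{j-1}x_i^{j-1}(1-x_i)^{m-j}$, i.e. the binomial probability that exactly $j-1$ of the other $m-1$ draws fall below $x_i$, and uses $E[x_{(k)}^{N}]=k/(N+1)$ for the $k$-th lowest of $N$ uniform draws. It applies no normalisation, so comparing it with a Monte Carlo estimate would answer question 1 empirically.

```python
from math import comb

def candidate_formula(x_i, m, n):
    """Direct transcription of the displayed sum (no normalisation).
    Assumes n >= 2."""
    total = 0.0
    for j in range(1, n + 1):  # j = candidate rank of player i among the winners
        # binomial weight: exactly j-1 of the other m-1 draws lie below x_i
        weight = comb(m - 1, j - 1) * x_i ** (j - 1) * (1 - x_i) ** (m - j)
        # expected sum of the j-1 winners below x_i, each ~ U(0, x_i)
        below = (j - 1) * x_i / 2
        # expected sum of the n-j winners above x_i: the n-j lowest of the
        # m-j draws on (x_i, 1), using E[x_(k) of N draws] = k/(N+1)
        above = sum(x_i + (1 - x_i) * k / (m - j + 1)
                    for k in range(1, n - j + 1))
        total += weight * (below + above)
    return total / (n - 1)
```

A useful consistency check: for $n=m$ the binomial weights sum to $1$, so no normalisation issue arises and the answer should be exactly $1/2$ for any $x_i$.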


QUESTIONS:

  1. Is the above correct? For example, am I missing a normalisation?
  2. Is it correct that if, in addition, $i$ also knew his "rank", then rather than using the summation above he would only consider the single summand corresponding to that position? (From the comments, this seems to be correct.) How does this change a potential normalisation?
  3. Is it correct that $$ E[x_{(k)}^{m-j}]=\frac{k}{m+1-j} $$ in the equation above, with $x_{(k)}^{m-j}$ being (if I understand the notation correctly) the $k$-th lowest of $m-j$ i.i.d. draws?
  4. If I were also interested in the expected square of the other "winners" (not the square of their expected values), I would need to modify the formula above as follows:
    • replace $(j-1)\frac{x_i}{2}$ by $$E\Big[\sum_{k=1}^{j-1}x_k^2\,\Big|\,x_k\sim U(0,x_i),\text{ i.i.d.}\Big]=(j-1)\frac{x_i^2}{3}$$
    • and $E[x_{(k)}^{m-j}]$ by $$E[(x_{(k)}^{m-j})^2]=\frac{k(k+1)}{(m+1-j)(m+2-j)}$$
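The order-statistic moments appearing in questions 3 and 4 can be checked numerically. The $k$-th lowest of $N$ i.i.d. $U(0,1)$ draws has a $\mathrm{Beta}(k,\,N-k+1)$ distribution, whose first two moments are $k/(N+1)$ and $k(k+1)/((N+1)(N+2))$; note the second moment is not the square of the first. A small simulation sketch (function name hypothetical):

```python
import numpy as np

def os_moments(N, k, trials=200_000, seed=1):
    """Simulate the k-th smallest of N iid U(0,1) draws;
    return (sample mean, sample mean of the square)."""
    rng = np.random.default_rng(seed)
    samples = np.sort(rng.random((trials, N)), axis=1)[:, k - 1]
    return samples.mean(), (samples ** 2).mean()

# Candidate closed forms (Beta(k, N-k+1) moments):
#   E[U_(k)]   = k / (N+1)
#   E[U_(k)^2] = k*(k+1) / ((N+1)*(N+2))
```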

Also, should I delete my lengthy (and wrong) comments below? Should I make this question more "canonical" (and if so, how)?