This is not a homework question but something that occurred to me while studying for an examination. Is there a "shortcut" way of calculating this?
There are $20$ independent uniform random variables, each denoting the time of an event, $X_i \sim U(0,1)$ for $1 \leq i \leq 20$ (continuous uniform). What is the average time until the second event takes place?
I said the following:
$P(\text{first two events happen} \leq t) = P(X_1 \leq t)P(X_2 \leq t)P(X_3 \geq t) ... P(X_{20} \geq t) + P(X_1 \leq t)P(X_3 \leq t)P(X_2 \geq t)...P(X_{20} \geq t)...$
Meaning, it is equal to the following: $\binom{20}{2}t^2(1-t)^{18}$
But I get a negative number when I try to compute the mean this way (differentiating the above and then integrating $t$ against the result), so my approach is clearly wrong. What is the correct way of tackling this, and what is wrong with my approach?
Given that $P(0 \le X_i < t) = t$ for $0 \le t < 1$, the second event falls in $[t, t+dt)$ precisely when one of the $n$ variables lands in $[t, t+dt)$ ($n$ choices, probability $dt$), exactly one of the remaining $n-1$ falls before $t$ (probability $t$), and the other $n-2$ fall after $t$ (probability $(1-t)^{n-2}$);
thus the probability density is $p_2(t)=n(n-1)t(1-t)^{n-2}$, and the average will be $$ \eqalign{ & E(t_{\,2} ) = n\left( {n - 1} \right)\int_{t = 0}^1 {t^{\,2} \left( {1 - t} \right)^{\,n - 2} dt} = n\left( {n - 1} \right){\rm B}(3,n - 1) = n\left( {n - 1} \right)\left( {{{2!\left( {n - 2} \right)!} \over {\left( {n + 1} \right)!}}} \right) = \cr & = {{2!} \over {\left( {n + 1} \right)}} = {2 \over {21}} \cr} $$ where ${\rm B}(x,y)$ is the Beta Function
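As a quick sanity check on the $2/21$ result, here is a small Monte Carlo sketch (plain Python, names are ad hoc): sort $n=20$ uniform draws and average the second-smallest.

```python
import random

# Monte Carlo check: the mean of the 2nd order statistic of n = 20
# i.i.d. U(0,1) draws should be close to 2/(n+1) = 2/21 ≈ 0.0952.
random.seed(0)
n, trials = 20, 200_000
total = 0.0
for _ in range(trials):
    draws = sorted(random.random() for _ in range(n))
    total += draws[1]  # second-smallest draw = time of the second event
estimate = total / trials
print(estimate)  # ≈ 0.0952
```

With $200{,}000$ trials the standard error is of order $10^{-4}$, so the estimate lands well within rounding distance of $2/21$.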
Note that for $n=3$ we would get $ E(t_{\,2} ) =1/2$, which is to be expected by symmetry.
Addendum
Note that in case we were looking for the time of occurrence of the third event, then the probability density would be: $$ p_{\,3} (t) = {{n\left( {n - 1} \right)\left( {n - 2} \right)} \over {2!}}t^{\,2} \left( {1 - t} \right)^{\,n - 3} $$
The division by $2!$ is because $t^2$ is the probability that the first two events occur before $t$ in whichever order, while $n(n-1)$ counts them as ordered; each ordered pair should instead carry the probability $\int_{\tau _{\,2} = 0}^{\,t} {\left( {\int_{\tau _{\,1} = 0}^{\tau _{\,2} } {d\tau _{\,1} } } \right)d\tau _{\,2} } = t^2/2!$ (the uniform density is $1$).
Thus in the general case (probability of $m$-th event occurring at $t$) we have: $$ \bbox[lightyellow] { p_{\,m} (t) = {{n^{\,\underline {\,m\,} } } \over {\left( {m - 1} \right)!}}t^{\,m - 1} \left( {1 - t} \right)^{\,n - m} }$$ where $n^{\,\underline {\,m\,} } $ denotes the Falling Factorial.
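The same Beta-function computation as above gives $E(t_m) = m/(n+1)$ in general. As a hedged numeric sketch (midpoint-rule quadrature, helper names are ad hoc), one can check that $p_m$ integrates to $1$ and has that mean:

```python
from math import factorial

# Numeric check of p_m(t) = n^(falling m)/(m-1)! * t^(m-1) (1-t)^(n-m):
# total mass should be 1 and the mean should be m/(n+1).
def falling(n, m):
    prod = 1
    for k in range(m):
        prod *= n - k
    return prod

def p(m, n, t):
    return falling(n, m) / factorial(m - 1) * t ** (m - 1) * (1 - t) ** (n - m)

n, steps = 20, 10_000
results = {}
for m in (1, 2, 3, 10, 20):
    ts = [(k + 0.5) / steps for k in range(steps)]  # midpoint rule on [0,1]
    mass = sum(p(m, n, t) for t in ts) / steps
    mean = sum(t * p(m, n, t) for t in ts) / steps
    results[m] = (mass, mean)
    print(m, round(mass, 4), round(mean, 4))
```

For $m=2$, $n=20$ this reproduces the $2/21 \approx 0.0952$ obtained analytically.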
In fact $$ \eqalign{ & \int_{t = 0}^1 {p_{\,m} (t)dt} = {{n^{\,\underline {\,m\,} } } \over {\left( {m - 1} \right)!}}\int_{t = 0}^1 {t^{\,m - 1} \left( {1 - t} \right)^{\,n - m} dt} = {{n^{\,\underline {\,m\,} } } \over {\left( {m - 1} \right)!}}\;{\rm B}(m,n - m + 1) = \cr & = {{n^{\,\underline {\,m\,} } } \over {\left( {m - 1} \right)!}}{{\left( {m - 1} \right)!\left( {n - m} \right)!} \over {n!}} = 1 \cr} $$ and $$ \eqalign{ & \sum\limits_{1\, \le \,m\, \le \,n} {p_{\,m} (t)} = \sum\limits_{1\, \le \,m\, \le \,n} {{{n^{\,\underline {\,m\,} } } \over {\left( {m - 1} \right)!}}t^{\,m - 1} \left( {1 - t} \right)^{\,n - m} } = \cr & = n\sum\limits_{1\, \le \,m\, \le \,n} {\left( \matrix{ n - 1 \cr m - 1 \cr} \right)t^{\,m - 1} \left( {1 - t} \right)^{\,\left( {n - 1} \right) - \left( {m - 1} \right)} } = n \cr} $$ i.e. $P (\text{any event in }[t,t+dt))=n\,dt$ on the unit interval, as it should be.
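The second identity can also be confirmed numerically; a minimal sketch (plain Python, helper names are ad hoc) evaluating the sum at a few values of $t$:

```python
from math import factorial

# Sanity check: the densities p_1(t), ..., p_n(t) sum to n for any fixed
# t in (0,1), matching P(any event in [t, t+dt)) = n dt on [0,1].
def p(m, n, t):
    prod = 1
    for k in range(m):
        prod *= n - k  # falling factorial n(n-1)...(n-m+1)
    return prod / factorial(m - 1) * t ** (m - 1) * (1 - t) ** (n - m)

n = 20
totals = [sum(p(m, n, t) for m in range(1, n + 1)) for t in (0.1, 0.25, 0.5, 0.9)]
print(totals)  # each entry equals n = 20 up to floating-point rounding
```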
Note about your approach
Your calculation is correct insofar as it returns the probability that some two of the events occur before $t$ and the remaining ones at or after $t$.
Therefore it includes the cases $[t_{1},t_{1}+dt_{1}) < [t_{2},t_{2}+dt_{2}) < [t,1)$ integrated over $t_{1},t_{2}$, so it is somewhat of a cumulative probability. But for the purpose of calculating $E(t_2)$ it cannot be used, because of two bugs:
- the value $t$ is assigned also to cases in which $t_2 < t$;
- the remaining events are taken to occur always after $t$, not after $t_2$ (so you cannot use the derivative of your probability).
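These bugs can be seen numerically. A hedged sketch (plain Python, midpoint quadrature): $F(t) = \binom{20}{2}t^2(1-t)^{18}$ is not a CDF (it rises and then falls back to $0$), so its derivative $g$ is not a density — it integrates to $0$ over $[0,1]$, and the would-be mean $\int_0^1 t\,g(t)\,dt$ comes out to $-1/21$, exactly the kind of negative number reported in the question.

```python
from math import comb

n = 20

def g(t):
    # derivative of F(t) = C(n,2) t^2 (1-t)^(n-2)
    return comb(n, 2) * (2 * t * (1 - t) ** (n - 2)
                         - (n - 2) * t ** 2 * (1 - t) ** (n - 3))

steps = 100_000
ts = [(k + 0.5) / steps for k in range(steps)]  # midpoint rule on [0,1]
mass = sum(g(t) for t in ts) / steps
mean = sum(t * g(t) for t in ts) / steps
print(mass, mean)  # mass ≈ 0 (not 1!), mean ≈ -1/21 ≈ -0.0476
```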