Ordering of maximums from exponential random variables


Let $E_1,\ldots,E_n$ be independent exponential random variables with parameters $(\alpha_1,\ldots,\alpha_n)$. Further, for $1\leq z< n$, let $M_1=\max\{E_1,\ldots,E_z\}$ and $M_2=\max\{E_{z+1},\ldots, E_{n}\}$. What is $\mathrm{Pr}(M_1<M_2)$? If $z=n-1$, a simple expression emerges: $$ \prod_{i=1}^{n-1}\frac{\alpha_i}{\alpha_i+\alpha_n} $$ Is there something as nice in the general case?
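Before hunting for a closed form, it can help to estimate $\mathrm{Pr}(M_1<M_2)$ by simulation. A minimal Monte Carlo sketch (function name is mine), assuming the $\alpha_i$ are rate parameters; one quick sanity check is that with all rates equal, symmetry makes each $E_i$ the maximum with probability $1/n$:

```python
import numpy as np

def mc_p_m1_less_m2(alphas, z, n_samples=200_000, seed=0):
    """Monte Carlo estimate of P(M1 < M2), where M1 = max(E_1..E_z)
    and M2 = max(E_{z+1}..E_n); alphas are treated as rates."""
    rng = np.random.default_rng(seed)
    alphas = np.asarray(alphas, dtype=float)
    # numpy parameterizes by scale = 1/rate
    e = rng.exponential(scale=1.0 / alphas, size=(n_samples, len(alphas)))
    m1 = e[:, :z].max(axis=1)
    m2 = e[:, z:].max(axis=1)
    return float((m1 < m2).mean())
```

Flip `1.0 / alphas` to `alphas` if the parameters are meant as means rather than rates.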



Best answer:

I can write it as a sum over permutations; if we use rate parameters $r_i=\frac1{\alpha_i}$, the probability that the exponentials are in order $E_{\sigma(1)} < E_{\sigma(2)} <\cdots < E_{\sigma(n)}$ is $\frac{r_1r_2\cdots r_n}{r_{\sigma(n)}(r_{\sigma(n)}+r_{\sigma(n-1)})\cdots(r_{\sigma(n)}+r_{\sigma(n-1)}+\cdots+r_{\sigma(1)})}$. In product and sum notation, that's $$\frac{\prod_{i=1}^n r_i}{\prod_{j=1}^n\left(\sum_{k=j}^n r_{\sigma(k)}\right)}$$ Want the probability that some particular $E_i$ is the maximum? Sum over the permutations $\sigma$ with $\sigma(n)=i$. The probability that the maximum is in some subset $S$? Sum over $\sigma$ with $\sigma(n)\in S$.
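For small $n$, this permutation sum is easy to evaluate exactly in code. A sketch under the same setup (independent exponentials with the given rates; function names are mine):

```python
from itertools import permutations
from math import prod

def p_order(rates, sigma):
    """P(E_{sigma[0]} < E_{sigma[1]} < ... < E_{sigma[-1]}) for
    independent exponentials, via the formula above."""
    den, tail = 1.0, 0.0
    for i in reversed(sigma):
        tail += rates[i]   # r_{sigma(n)}, r_{sigma(n)} + r_{sigma(n-1)}, ...
        den *= tail        # product of the tail sums in the denominator
    return prod(rates) / den

def p_max_in(rates, subset):
    """P(the index of the largest E_i lies in `subset`): sum the order
    probabilities over permutations whose last element is in the subset."""
    return sum(p_order(rates, s)
               for s in permutations(range(len(rates)))
               if s[-1] in subset)
```

Summing `p_order` over all $n!$ permutations should give $1$, which is a handy check.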

This is not pretty, and it gets more complicated as the number of exponentials grows, in stark contrast to the minimum, which always stays nice. That minimum, by the way? If we sum over $\sigma$ with $\sigma(1)=i$, all of the terms that don't involve $r_{\sigma(1)}=r_i$ cancel to $1$ and we get $$\frac{r_i}{\sum_{j=1}^n r_j}$$ That doesn't happen with the maximum. I worked out $n=3$; the probability that $E_1$ is largest is $$\frac{r_2r_3(2r_1+r_2+r_3)}{(r_1+r_2)(r_1+r_3)(r_1+r_2+r_3)}=\frac{\alpha_1^2(\alpha_1\alpha_2+\alpha_1\alpha_3+2\alpha_2\alpha_3)}{(\alpha_1+\alpha_2)(\alpha_1+\alpha_3)(\alpha_1\alpha_2+\alpha_1\alpha_3+\alpha_2\alpha_3)}$$ For $n=4$, we're looking at sums of six terms for each index. I could do that, but I'd need quite a bit more scratch space.
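The $n=3$ closed form can be cross-checked against the inclusion–exclusion expansion $1-\frac{r_1}{r_1+r_2}-\frac{r_1}{r_1+r_3}+\frac{r_1}{r_1+r_2+r_3}$, obtained (my derivation, not from the answer) by integrating $\int_0^\infty r_1 e^{-r_1 t}(1-e^{-r_2 t})(1-e^{-r_3 t})\,dt$. A quick sketch:

```python
def p_e1_largest_closed(r1, r2, r3):
    """The n = 3 closed form above: P(E_1 is the largest)."""
    return r2 * r3 * (2*r1 + r2 + r3) / ((r1 + r2) * (r1 + r3) * (r1 + r2 + r3))

def p_e1_largest_incl_excl(r1, r2, r3):
    """Same probability by expanding (1 - e^{-r2 t})(1 - e^{-r3 t})
    and integrating against the density r1 * exp(-r1 t)."""
    return 1 - r1/(r1 + r2) - r1/(r1 + r3) + r1/(r1 + r2 + r3)
```

The two agree identically in the rates, e.g. both give $7/12$ at $(r_1,r_2,r_3)=(1,2,3)$ and $1/3$ in the symmetric case.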

Another answer:

Let $F(x)=P(M_1\le x)=\prod_{i=1}^z P(E_i\le x)$ and $G(x)=P(M_2\le x)=\prod_{i=z+1}^n P(E_i\le x)$. By independence, $P(M_1\le M_2\mid M_2=x)=F(x)$, so integrating over the distribution of $M_2$ gives $$P(M_1\le M_2)=\int_0^\infty F(x)\,G'(x)\,dx.$$

For this particular problem $P(E_i\le x)=1-e^{-\alpha_i x}$; the integral can be evaluated by expanding the products and integrating term by term, but writing it out looks messy.
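Messy, but mechanical: expanding both products by inclusion–exclusion turns the integral into a finite sum of terms of the form $\pm\,\alpha_j/(\alpha_j+\sum_{i\in S\cup T}\alpha_i)$. A sketch of that expansion (my own function names, with the $\alpha_i$ as rates, matching $P(E_i\le x)=1-e^{-\alpha_i x}$):

```python
from itertools import chain, combinations

def powerset(xs):
    xs = list(xs)
    return chain.from_iterable(combinations(xs, k) for k in range(len(xs) + 1))

def p_m1_le_m2(alpha, z):
    """P(M1 <= M2) = int_0^inf F(x) G'(x) dx, with F, G as above,
    evaluated exactly by inclusion-exclusion over both products.
    alpha[i] is the rate of E_{i+1}; M1 uses the first z variables."""
    n = len(alpha)
    total = 0.0
    for j in range(z, n):                 # the term of G' where E_j carries the density
        others = [i for i in range(z, n) if i != j]
        for S in powerset(range(z)):      # expansion of F(x)
            for T in powerset(others):    # expansion of the remaining factors of G'
                sign = (-1.0) ** (len(S) + len(T))
                extra = sum(alpha[i] for i in S) + sum(alpha[i] for i in T)
                # int_0^inf alpha_j * exp(-(alpha_j + extra) x) dx
                total += sign * alpha[j] / (alpha[j] + extra)
    return total
```

The cost is exponential in $n$ (one term per pair of subsets), which is the "messy" part in formula form, but it is exact and fine for small $n$.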