My question stems from self-study of Problem 91 on page 372 of Ross's *Introduction to Probability Models*, 12th edition.
The answer is given in the textbook but I am trying to understand a specific step that I suspect depends on Lebesgue integration.
For clarity I present the entire problem.
Let $X_1, \ldots, X_n$ be i.i.d. exponential random variables with rate $\lambda$, and let $M=\max_j X_j$. Show that $$ P\left \{ M > \sum_{i=1}^n X_i - M \right \} = \frac{n}{2^{n-1}} $$
A hint is given: What is $P\{X_1 > \sum_{i=2}^n X_i \}$?
The hint suggests the following approach:
$$ P\left \{ M > \sum_{i=1}^n X_i - M \right \} = \sum_{k=1}^n P\left \{ M > \sum_{i=1}^n X_i - M \bigg| M = X_k \right \} P\left \{M=X_k \right \} $$
and $$ \begin{split} P\left \{X_1 > \sum_{i=2}^n X_i \right \} &= \int_0^\infty \left ( P\{X_1 > Y | X_1 = x \} f_{X_1}(x) \right ) dx \\ &= \int_0^\infty \left ( P \{x > Y \} f_{X_1}(x) \right ) dx \end{split} $$
where $Y \sim \operatorname{Gamma}(n-1,\lambda)$.
This leads to
$$ P\left \{X_1 > \sum_{i=2}^n X_i \right \} = \frac{1}{2^{n-1}} $$
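For reference, integrating over $Y$ first makes the computation explicit (this is my own filling-in of the step, assuming $n \ge 2$, using the Gamma density $f_Y(y) = \lambda^{n-1} y^{n-2} e^{-\lambda y}/(n-2)!$):

$$ \begin{split} P\left \{X_1 > \sum_{i=2}^n X_i \right \} &= \int_0^\infty P\{X_1 > y\}\, f_Y(y)\, dy = \int_0^\infty e^{-\lambda y}\, \frac{\lambda^{n-1} y^{n-2} e^{-\lambda y}}{(n-2)!}\, dy \\ &= \frac{\lambda^{n-1}}{(n-2)!} \int_0^\infty y^{n-2} e^{-2\lambda y}\, dy = \frac{\lambda^{n-1}}{(n-2)!} \cdot \frac{(n-2)!}{(2\lambda)^{n-1}} = \frac{1}{2^{n-1}} \end{split} $$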
How do you compute $P\{M=X_k\}$? Or am I setting this up wrong? The book's answer is just the sum, as if each $P\{M=X_k\}$ were $1$. I suspect some special Lebesgue integration is involved.
Can anyone shed some light on the issue?
Thanks! GMercier
You seem to have proved that
$$ P\left \{ M > \sum_{i=1}^n X_i - M \bigg| M = X_k \right \} = P\left \{X_1 > \sum_{i=2}^n X_i \right \} = \frac{1}{2^{n-1}}$$
Assuming that's right, you don't need to compute $P\{M=X_k\}$: your conditional probability is a constant in $k$, so it factors out of the summation, and of course you must have $\sum_k P\{M=X_k\}=1$ (your guess $P\{M=X_k\}=1$ is clearly absurd, both because of the above and by intuition).
The problem is that the first equality above is wrong. When you condition on the event $M=X_1$, it is true that the event $ M > \sum_{i=1}^n X_i - M $ can be rewritten as $X_1 >\sum_{i=2}^n X_i $... but you cannot forget the conditioning. In general, $P(X_1 >\sum_{i=2}^n X_i) \ne P(X_1 >\sum_{i=2}^n X_i \mid X_1 = M) $.
What we know (why?) is this: $ X_1 >\sum_{i=2}^n X_i \implies X_1 = M $
Then $$P(X_1 >\sum_{i=2}^n X_i \mid X_1 = M) = \frac{P(X_1 >\sum_{i=2}^n X_i \cap X_1 = M)}{P( X_1 = M)}=\frac{P(X_1 >\sum_{i=2}^n X_i) }{P( X_1 = M)}$$
Can you go on from here?
(Granted, this shows that we didn't need conditionals at all, only the law of total probability:)
$$ \begin{split} P\Big( M > \sum_{i=1}^n X_i - M\Big) &= \sum_k P\Big( M > \sum_{i=1}^n X_i - M \,\cap\, M = X_k\Big) \\ &= \sum_k P\Big(X_k >\sum_{i\ne k} X_i\Big) \\ &= n\, P\Big(X_1 >\sum_{i=2}^n X_i\Big) \end{split} $$
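As a sanity check (not part of the proof), a short Monte Carlo simulation confirms that the probability matches $n/2^{n-1}$. This is a sketch of mine, not from the book; the function name `estimate` is my own:

```python
import random

def estimate(n, trials=200_000, lam=1.0, seed=0):
    """Monte Carlo estimate of P{M > X_1 + ... + X_n - M}
    for n i.i.d. Exponential(lam) random variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.expovariate(lam) for _ in range(n)]
        m = max(xs)
        # Event: the maximum exceeds the sum of all the other terms
        if m > sum(xs) - m:
            hits += 1
    return hits / trials

for n in (2, 3, 4):
    print(n, estimate(n), n / 2 ** (n - 1))
```

For $n=2$ the event is just $\max > \min$, which holds almost surely, matching $2/2^1 = 1$; for $n=3$ and $n=4$ the estimates land near $3/4$ and $1/2$.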