Let $Y_1, Y_2, \dots$ be independent and identically distributed random variables on $(\Omega, \mathscr{F}, \mathbb{P})$ such that their common distribution is continuous and $$F_{Y}(y) := F_{Y_1}(y) = F_{Y_2}(y) = \cdots$$
For $m = 1, 2, ...$ and $i \le m$, define $$A_{i,m} := (\max\{Y_1, Y_2, ..., Y_m\} = Y_i), B_m := A_{m,m}$$
Is it really true that $$P(B_m) = P(A_{m-1,m}) = \cdots = P(A_{2,m}) = P(A_{1,m}),$$
and hence $$P(B_m) = 1/m?$$
I guess that, for fixed $m$, the events $(A_{i,m})_{i \le m}$ are pairwise disjoint up to null sets (ties such as $\{Y_i = Y_j\}$ for $i \ne j$ have probability zero by continuity), so that
$$\sum_{i=1}^{m} P(A_{i,m}) = P\left(\bigcup_{i=1}^{m} A_{i,m}\right) = 1.$$
What I don't really understand is why all the $P(A_{i,m})$'s are equal.
For example, why is it that $$P(A_{1,2}) = P(Y_1 \ge Y_2) = P(Y_2 \ge Y_1) = P(A_{2,2})?$$ Why $1/2$ each? Why not $1/4, 3/4$? Or even $1, 0$? I'm guessing this has something to do with independence or continuity.
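A quick Monte Carlo sketch (a sanity check, not a proof — the function name and use of Uniform(0,1) draws are my choices) suggests the answer really is $1/m$:

```python
import random

# Monte Carlo sketch: estimate P(Y_m = max(Y_1, ..., Y_m))
# for iid continuous draws; symmetry predicts 1/m.
random.seed(0)

def estimate_p_last_is_max(m, trials=100_000):
    """Fraction of trials in which the last of m iid draws is the maximum."""
    hits = 0
    for _ in range(trials):
        ys = [random.random() for _ in range(m)]  # iid Uniform(0, 1)
        if ys[-1] == max(ys):  # ties occur with probability 0 here
            hits += 1
    return hits / trials

for m in (2, 3, 5):
    print(m, estimate_p_last_is_max(m))  # each estimate should be near 1/m
```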
Assuming the random variables are absolutely continuous (what if the pdfs don't exist? :O) and writing $n$ in place of $m$,
$$P(B_n) = \int_{\mathbb R} \int_{-\infty}^{y_n} \cdots \int_{-\infty}^{y_n} \int_{-\infty}^{y_n} f_{Y_1, \dots, Y_n} (y_1, \dots, y_n) \, dy_1 \, dy_2 \cdots dy_{n-1} \, dy_n$$
By independence, we have
$$ = \int_{\mathbb R} \int_{-\infty}^{y_n} \cdots \int_{-\infty}^{y_n} \int_{-\infty}^{y_n} f_{Y_1}(y_1) \cdots f_{Y_n}(y_n) dy_1dy_2...dy_{n-1}dy_n$$
$$ = \int_{\mathbb R} [\int_{-\infty}^{y_n} \cdots \int_{-\infty}^{y_n} \int_{-\infty}^{y_n} f_{Y_1}(y_1) \cdots f_{Y_{n-1}}(y_{n-1}) dy_1dy_2...dy_{n-1}] f_{Y_n}(y_n) dy_n$$
$$ = \int_{\mathbb R} [F_{Y_1}(y_n) ... F_{Y_{n-1}}(y_n)] f_{Y_n}(y_n) dy_n$$
By identical distribution,
$$ = \int_{\mathbb R} [F_{Y_n}(y_n) ... F_{Y_{n}}(y_n)] f_{Y_n}(y_n) dy_n$$
$$ = \int_{\mathbb R} [F_{Y_n}(y_n)]^{n-1} f_{Y_n}(y_n) dy_n$$
$$ = \Big[ [F_{Y_n}(y_n)]^{n-1} F_{Y_n}(y_n) \Big]_{-\infty}^{\infty} - \int_{\mathbb R} (n-1) [F_{Y_n}(y_n)]^{n-1} f_{Y_n}(y_n) \, dy_n$$
$$ = [(1)(1) - (0)(0)] - (n-1) \int_{\mathbb R} [F_{Y_n}(y_n)]^{n-1} f_{Y_n}(y_n) \, dy_n$$
Writing $I := \int_{\mathbb R} [F_{Y_n}(y_n)]^{n-1} f_{Y_n}(y_n) \, dy_n$ for the integral we started with, this says $I = 1 - (n-1)I$, so $nI = 1$ and
$$P(B_n) = I = 1/n.$$
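As a concrete sanity check of $\int_{\mathbb R} [F(y)]^{n-1} f(y) \, dy = 1/n$, take $Y_i \sim \mathrm{Uniform}(0,1)$, so $F(y) = y$ and $f(y) = 1$ on $[0,1]$, and integrate numerically (a sketch; the midpoint rule below is just one choice):

```python
# Numeric check of ∫ F(y)^(n-1) f(y) dy = 1/n for Uniform(0, 1),
# where F(y) = y and f(y) = 1 on [0, 1], so the integrand is y^(n-1).
def uniform_integral(n, steps=100_000):
    """Midpoint-rule approximation of the integral of y^(n-1) over [0, 1]."""
    h = 1.0 / steps
    return sum(((k + 0.5) * h) ** (n - 1) for k in range(steps)) * h

for n in (2, 3, 4):
    print(n, uniform_integral(n))  # each value should be close to 1/n
```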
If $Y_1,\dots,Y_n$ are iid and $\sigma:\{1,\dots,n\}\to\{1,\dots,n\}$ is a permutation, then $$F_{Y_{\sigma(1)},\dots,Y_{\sigma(n)}}(y_1,\dots,y_n)=F(y_1)\times\cdots\times F(y_n)=F_{Y_1,\dots,Y_n}(y_1,\dots,y_n),$$ where $F$ is the common CDF of the $Y_i$.
So the random vectors $\langle Y_{\sigma(1)},\dots,Y_{\sigma(n)}\rangle$ and $\langle Y_1,\dots,Y_n\rangle$ have equal distribution.
Consequently the events $\{Y_n=\max(Y_1,\dots,Y_n)\}$ and $\{Y_{\sigma(n)}=\max(Y_1,\dots,Y_n)\}$ have equal probability. Choosing $\sigma$ with $\sigma(n)=i$ shows $P(A_{i,n})=P(B_n)$ for every $i\le n$, and summing over $i$ gives $P(B_n)=1/n$.
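The symmetry can also be seen empirically: for iid draws, each index is equally likely to carry the maximum (a simulation sketch; the function name and use of standard normal draws are my choices):

```python
import random

# Sketch: estimate P(A_{i,n}) for each i by counting which coordinate
# of an iid sample is the largest; symmetry predicts 1/n for every i.
random.seed(1)

def argmax_frequencies(n, trials=100_000):
    """Empirical frequency with which each of n iid N(0,1) draws is the max."""
    counts = [0] * n
    for _ in range(trials):
        ys = [random.gauss(0.0, 1.0) for _ in range(n)]
        counts[max(range(n), key=ys.__getitem__)] += 1
    return [c / trials for c in counts]

print(argmax_frequencies(4))  # all four frequencies should be near 0.25
```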