Understanding $P(\max\{X_{1},\ldots,X_{n}\} \le y) = P(X_{1} \le y,\ldots,X_{n} \le y)$.


I understand the logical statement that: $$\max\{{X_{1},\ldots,X_{n}}\}(\omega) \le y \iff X_{1}(\omega) \le y,\ldots,X_{n}(\omega) \le y$$ But my issue lies with $\max\{\cdot\}$ itself. Shouldn't the output of this function be a single random variable? Why is, say, $\max\{1,2,3,3\} = 3$, but $\max\{A, B, C\}(\omega) \ne C(\omega)$, where $A$, $B$, and $C$ are random variables?

I would have assumed that $P(\max\{A, B, C\}(\omega) \le d) = P(C(\omega) \le d)$, where $C$ is the random variable that maps each outcome to the largest value on the real line, as opposed to $$P(\max\{A,B,C\}(\omega) \le d) = P(A(\omega) \le d, B(\omega) \le d, C(\omega) \le d)$$

Hopefully someone can point me in the right direction.

Edit: missed some omegas.


In short, you would be right if these were real numbers, like $\{1, 2, 3, 3\}$. You are saying you want to write something like this:

Suppose $\max\{X_1, X_2, X_3, \dots, X_n\} = X_k$. Then $P(\max\{X_1, X_2, X_3, \dots, X_n\} \leq y) = P(X_k \leq y)$.

However, the problem is the $X$'s are random, so $k$ is not fixed! Suppose that $n = 3$ for simplicity. On the first experiment, we may get $\max\{X_1, X_2, X_3\} = X_2$. On the second experiment, we may get $\max\{X_1, X_2, X_3\} = X_1$. On the third experiment, we may again get $\max\{X_1, X_2, X_3\} = X_2$. And so on...
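A quick simulation makes this concrete. The following sketch (a hypothetical setup, not from the question) draws three i.i.d. uniform random variables repeatedly and records which index attains the maximum; the winning index changes from experiment to experiment, so there is no single fixed $X_k$ that is "the maximum":

```python
import random

random.seed(0)

# Three i.i.d. Uniform(0,1) random variables X_1, X_2, X_3 (assumed example).
# Each loop iteration is one "experiment" (one draw of omega); the index k
# attaining max{X_1, X_2, X_3} varies across experiments.
winners = set()
for _ in range(1000):
    x = [random.random() for _ in range(3)]
    k = max(range(3), key=lambda i: x[i])  # 0-based index of the largest value
    winners.add(k + 1)

print(sorted(winners))  # every index wins on some experiment
```

Over enough trials, each of $X_1$, $X_2$, $X_3$ is the maximum on some outcome, which is exactly why $\max\{X_1,\dots,X_n\}$ cannot be identified with any one $X_k$.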

This means even defining $\max\{X_1, X_2, X_3, \dots, X_n\} = X_k$ would require a probability distribution on $k$, which makes the problem at least as hard as the original one, and (more likely) even harder (see order statistics).

It is easier to apply the equivalence you mentioned to get individual events for each $X_i \leq y$ for $i \leq n$, especially if you know the $X$'s have some nice properties (like independence). For example, this comes up a lot in auction theory for government contracting, where we assume each bidder's valuation function (modeled as a random variable $X_i$) is independent.