Probability statement not clear


Maximum order statistic of a random sample from a U[0,θ] distribution

Let $X_1,X_2,\dots$ be a sequence of i.i.d. uniform random variables on $[0,\theta]$. Let $X(n)=\max_{1\le k\le n} X_k$. The distribution function of $X(n)$ is

$$P(X(n)\le x)=\left(\frac{x}{\theta}\right)^n, \qquad 0\le x\le\theta.$$
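(For completeness, this is a one-line consequence of independence: the maximum is at most $x$ exactly when every term is.)

$$P(X(n)\le x)=P(X_1\le x,\dots,X_n\le x)=\prod_{k=1}^{n}P(X_k\le x)=\left(\frac{x}{\theta}\right)^n, \qquad 0\le x\le\theta.$$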

(Note that by a very simple argument, this actually also shows that $X(n) \to \theta$ in probability, and even, almost surely, if the random variables are all defined on the same space.)
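The "very simple argument" can be checked numerically. The sketch below (with $\theta$, $n$, and the sample size chosen arbitrarily for illustration) compares the empirical CDF of the maximum against $(x/\theta)^n$, and prints the exact tail probability $P(|X(n)-\theta|>\epsilon)=((\theta-\epsilon)/\theta)^n$, which goes to $0$:

```python
import random

random.seed(0)
theta, n, trials = 2.0, 10, 100_000

# Empirical P(X(n) <= x) vs. the exact CDF (x/theta)^n at one test point.
x = 1.5
maxima = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(trials)]
empirical = sum(m <= x for m in maxima) / trials
exact = (x / theta) ** n
print(f"empirical={empirical:.4f}  exact={exact:.4f}")  # values should be close

# Convergence in probability: P(|X(n) - theta| > eps) = ((theta - eps)/theta)^n.
eps = 0.1
for n_ in (10, 50, 200):
    print(n_, ((theta - eps) / theta) ** n_)
```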

1) I understand the first statement as saying that, since $X(n) \to \theta$, we have $P(X(n)\le x) \to 1$. Am I right?

2) But then, what does the "almost surely" part mean?

3) Also, it says "if the random variables are all defined on the same space." But could it be otherwise, given that we were told "Let $X_1,X_2,\dots$ be a sequence of i.i.d. uniform random variables on $[0,\theta]$"? Doesn't that already mean that all the random variables are defined on the same sample space?

BEST ANSWER

If you are not already familiar with the measure-theoretic definition of "probability space", I would suggest you look up the Kolmogorov axioms for probability before continuing with this answer.

$X_n \to \theta$ in probability means that for all $\epsilon>0$, $P(|X_n-\theta|>\epsilon) \to 0$ as $n \to \infty$. $X_n \to \theta$ almost surely means that $P \left ( X_n \to \theta \right )=1$. The latter is stronger. Almost sure convergence only makes sense if the $X_n$ are defined on the same probability space, because it requires a notion of pointwise limits (you need to fix an $\omega \in \Omega$ and then look at the existence/value of $\lim_{n \to \infty} X_n(\omega)$).
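To make the "fix an $\omega$" idea concrete, here is a small sketch (seeds and the cutoff $n=10{,}000$ are arbitrary choices) in which each random seed plays the role of one fixed outcome $\omega$; once $\omega$ is fixed, $n \mapsto X(n)(\omega)$ is an ordinary nondecreasing sequence of numbers, and almost sure convergence says that sequence tends to $\theta$ for $P$-almost every $\omega$:

```python
import random

theta = 1.0
paths = {}

# Each seed stands in for one outcome omega. Fixing omega turns the
# running maximum X(n)(omega) into a deterministic sequence, which we
# can watch climb toward theta along each sample path separately.
for seed in (1, 2, 3):
    rng = random.Random(seed)
    running_max = 0.0
    for n in range(1, 10_001):
        running_max = max(running_max, rng.uniform(0, theta))
    paths[seed] = running_max
    print(f"omega #{seed}: X(10000)(omega) = {running_max:.6f}")
```

On each path the value ends up very close to $\theta$: the chance that 10,000 independent uniforms all stay below $0.99\theta$ is $0.99^{10000} \approx e^{-100}$.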

If the random variables have a joint distribution (e.g. if they are independent), then they must indeed be defined on the same space. But you could have a sequence of $U[0,\theta]$ random variables each defined on their own probability space. There just would be no joint distribution in this circumstance.

As for understanding the text at hand, it seems to boil down to: "the cdf of the maximum of the first $n$ variables is this. Given any sequence of random variables with this sequence of cdfs, they converge in probability to $\theta$. If they are defined on the same probability space, then they also converge almost surely." In other words, in the middle they seem to be forgetting that, in context, to make sense of the order statistic in the first place, the random variables must already be defined on the same probability space. All of the individual statements are correct, but this loss of context makes the presentation rather confusing.