Questions about the random variable when viewing it as a measurable function


Let $(\Omega,\mathcal{F},\mu)$ be a probability space. Let $\mathbf{X}_1,\dots,\mathbf{X}_n,\dots$ be a sequence of independent, identically distributed, integrable random variables defined on this probability space. Suppose $E(\mathbf{X}_i)=m$ for all $i=1,2,3,\dots$. The Strong Law of Large Numbers says that for almost every sample point $\omega\in\Omega$, $$ \frac{\mathbf{X}_1(\omega)+\dots+\mathbf{X}_n(\omega)}{n}\rightarrow m \quad \text{as } n\rightarrow \infty. $$

I'm very confused by the definition of a random variable as a measurable function. If we are given a certain $\omega\in \Omega$, shouldn't $\frac{\mathbf{X}_1(\omega)+\dots+\mathbf{X}_n(\omega)}{n}$ be a fixed value? Where does the randomness come into the above average formula? Thanks!


The randomness comes from $\Omega$. We don't know the outcome of our experiment, so we don't know which $\omega$ we will get. But once $\omega$ is fixed, $\mathbf{X}_i(\omega)$ is a fixed number, not random, and so is the entire sequence of sample averages. The SLLN is a statement about the collection of sample points: the set of $\omega$ for which the averages fail to converge to $m$ has probability zero.
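This point can be illustrated with a small simulation sketch (the choice of distribution, Uniform(0, 1), and the seed are illustrative assumptions, not part of the question). Fixing the random seed plays the role of fixing one sample point $\omega$: after that, every value $\mathbf{X}_i(\omega)$, and hence every sample average, is completely determined.

```python
import numpy as np

# Fixing the seed corresponds to fixing a single sample point omega:
# the whole realized sequence X_1(omega), X_2(omega), ... is then
# deterministic -- rerunning this script reproduces it exactly.
rng = np.random.default_rng(0)

m = 0.5          # E[X_i] for the Uniform(0, 1) distribution
n = 100_000
samples = rng.uniform(0.0, 1.0, size=n)   # X_1(omega), ..., X_n(omega)

# Running averages (X_1 + ... + X_k) / k for k = 1, ..., n.
running_means = np.cumsum(samples) / np.arange(1, n + 1)

print(running_means[-1])   # close to m = 0.5, as the SLLN predicts
```

The randomness lived entirely in the single call that chose $\omega$ (here, the seed); everything afterward is ordinary deterministic arithmetic, which is exactly the point of viewing a random variable as a function on $\Omega$.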