Law of large numbers for random variables


I am trying to understand the law of large numbers but maybe mixing some things up.

It says that for a sequence of i.i.d. random variables $X_1, X_2, \dots$, the sample average converges to the expected value: $n^{-1} \sum_{i=1}^{n} X_i \to \xi$ as $n \to \infty$, where $\xi = E[X_1]$.
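To see this convergence numerically, here is a small sketch (my own illustration, not from the textbook) that averages $n$ i.i.d. Bernoulli$(1/2)$ samples and watches the average approach $\xi = 1/2$:

```python
# Sketch: running averages of i.i.d. fair-coin tosses (Bernoulli(1/2))
# drift toward the expected value 1/2 as n grows.
import random

random.seed(0)  # fixed seed so the run is reproducible

def running_average(n):
    """Average of n i.i.d. Bernoulli(1/2) samples."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, running_average(n))
```

Each line printed is one realization of $n^{-1}\sum_{i=1}^n X_i$; the deviation from $1/2$ shrinks as $n$ increases.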

What I don't understand is this: if I fix some $\omega_0 \in \Omega$, then because all the random variables are i.i.d. I get $X_1(\omega_0)=X_2(\omega_0)=\dots=X_n(\omega_0)$, which makes no sense, because then the average is just $\frac{1}{n} \cdot n \cdot X_1(\omega_0) = X_1(\omega_0)$.

It only makes sense if I see each random variable as an individual experiment for some arbitrary $\omega \in \Omega$.

But as I understand it, a random variable is not an experiment in itself; it is a mapping, and my textbook says nothing about which outcome $\omega \in \Omega$ is chosen.

Some clarification would be appreciated.

Best Answer

Let us consider a simple example where we define independent and identically distributed random variables. Say we toss a fair coin three times. This is a single experiment. What are the elementary events here? There are $8$ of them in $\Omega$: $$ \Omega = \{HHH, HHT, HTH, HTT, THH, THT, TTH, TTT\}. $$ We can introduce $3$ i.i.d. random variables $X_1$, $X_2$, $X_3$ with Bernoulli distribution with success probability $1/2$, associated with the 1st, 2nd and 3rd coin tosses. How do they depend on $\omega$? As follows: $$ X_1(\omega) = \begin{cases}1, & \omega \in \{\color{red}{H}HH, \color{red}{H}HT, \color{red}{H}TH, \color{red}{H}TT\},\cr 0, & \omega \in \{\color{red}{T}HH, \color{red}{T}HT, \color{red}{T}TH, \color{red}{T}TT\},\end{cases} $$ $$ X_2(\omega) = \begin{cases}1, & \omega \in \{H\color{red}{H}H, H\color{red}{H}T, T\color{red}{H}H, T\color{red}{H}T\},\cr 0, & \omega \in \{H\color{red}{T}H, H\color{red}{T}T, T\color{red}{T}H, T\color{red}{T}T\}\end{cases} $$ and $$ X_3(\omega) = \begin{cases}1, & \omega \in \{HH\color{red}{H}, HT\color{red}{H}, TH\color{red}{H}, TT\color{red}{H}\},\cr 0, & \omega \in \{HH\color{red}{T}, HT\color{red}{T}, TH\color{red}{T}, TT\color{red}{T}\}.\end{cases} $$ You can see that it is not true that the values $X_i(\omega)$ coincide for every $\omega$. For instance, $X_1(HTH)=1$, $X_2(HTH)=0$, $X_3(HTH)=1$.
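The construction above can be written out directly (a sketch of my own, mirroring the case definitions): enumerate the $8$ elementary events and let $X_i$ read off the $i$-th toss.

```python
from itertools import product

# Sample space for three tosses: the 8 strings "HHH", "HHT", ..., "TTT".
Omega = ["".join(t) for t in product("HT", repeat=3)]

def X(i, omega):
    """X_i(omega) = 1 if the i-th toss (1-indexed) is heads, else 0,
    exactly as in the case definitions above."""
    return 1 if omega[i - 1] == "H" else 0

omega0 = "HTH"
print([X(i, omega0) for i in (1, 2, 3)])  # [1, 0, 1]
```

So at $\omega_0 = HTH$ the three random variables take different values, even though they are identically distributed.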

Another simple example: choose a point at random from the unit square $[0,1]\times[0,1]=\Omega$. Then the points $\omega=(X,Y)$ of the square are the elementary events. We can define two random variables $X_1(\omega)=X$, $X_2(\omega)=Y$ as the coordinates of the point. $X_1$ and $X_2$ are i.i.d. with uniform $U(0,1)$ distribution, and $X_1(\omega) = X_2(\omega)$ holds only for points $\omega$ on the diagonal of the square.