How should I interpret the definition of random variables in i.i.d. random variables and the law of large numbers?


I am learning stochastic processes and I have a question about the definition of a random variable:

As a brief, non-rigorous definition, a random variable is a function mapping a sample space $\Omega$ to the real numbers, and we may write it as $X(\omega),\ \omega\in \Omega$. So we may regard a random variable as a way of "numbering" the outcomes, right?
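To make this "numbering" picture concrete, here is a minimal sketch in Python (a hypothetical toy model, not from any textbook): the sample space is a finite set, the measure is a dictionary on it, and the random variable is a literal Python function on outcomes.

```python
# Sample space for one die roll (toy example).
omega = {1, 2, 3, 4, 5, 6}

# Uniform probability measure on the sample space.
P = {w: 1 / 6 for w in omega}

# X "numbers" each outcome: here, the indicator that the roll is even.
def X(w):
    return 1 if w % 2 == 0 else 0

# Probabilities of X's values come from pushing P forward through X:
p_X1 = sum(P[w] for w in omega if X(w) == 1)
print(p_X1)  # 0.5 (up to floating-point rounding)
```

Note that `X` itself involves no probability at all; the number 0.5 only appears once we combine `X` with the measure `P` on $\Omega$.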

If so, how should I interpret independent and identically distributed random variables? I mean, if $X_1$ and $X_2$ are i.i.d., are they functions on the same sample space $\Omega$? Or are they defined on different probability spaces $(\Omega_1,\mathcal{F}_1, P_1)$ and $(\Omega_2,\mathcal{F}_2, P_2)$ respectively, and merely share the same distribution? I find it hard to understand i.i.d. from this "mapping" point of view.
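One concrete way to realize two i.i.d. variables on a single space is a product construction; here is a small Python sketch of that idea (all names are my own, illustrative only). Both variables live on the same sample space of *pairs* of flips, each reads one coordinate, and the product measure is what encodes independence.

```python
from itertools import product

# One-coin sample space and its measure (toy model).
omega0 = ("H", "T")
p0 = {"H": 0.5, "T": 0.5}

# Common sample space for BOTH variables: pairs of flips,
# with the product measure (this is what makes them independent).
Omega = list(product(omega0, omega0))
P = {w: p0[w[0]] * p0[w[1]] for w in Omega}

# X1 looks only at the first coordinate, X2 only at the second;
# both apply the SAME numbering rule f, so they are identically distributed.
def f(c):
    return 1 if c == "H" else 0

def X1(w):
    return f(w[0])

def X2(w):
    return f(w[1])

# Independence check: P(X1=1, X2=1) == P(X1=1) * P(X2=1)
p_joint = sum(P[w] for w in Omega if X1(w) == 1 and X2(w) == 1)
p1 = sum(P[w] for w in Omega if X1(w) == 1)
p2 = sum(P[w] for w in Omega if X2(w) == 1)
print(p_joint, p1 * p2)  # 0.25 0.25
```

So in this construction the answer to "same or different space?" is: one shared space, with each $X_i$ depending on a different coordinate of $\omega$.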

Moreover, I ran into a similar problem when I learned the (weak) law of large numbers. It states:

$$\bar{X}_n=\frac{1}{n}\sum_{i=1}^n X_i,$$

$$\forall \epsilon>0,\lim_{n\rightarrow\infty}P(|\bar{X}_n-E(\bar{X}_n)|<\epsilon)=1$$

I cannot understand the addition of these random variables in this law. If a random variable is a function, does it mean that if I add up many such functions (and divide by $n$), I get a function whose value is nearly the constant $E(\bar{X}_n)$? I guess someone may explain it to me in terms of "picking numbers" or "flipping coins", but those explanations are not based on the actual definition of random variables, right? Random variables are just functions, with no intrinsic relationship to probability; the probability measure is attached to the sample space. Would you please be so kind as to help me from the "function" point of view? Have I made any mistake in the above description?
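From the "function" angle, the sum is just the pointwise sum on a common sample space: $\bar{X}_n(\omega)=\frac{1}{n}\sum_i X_i(\omega)$ is itself a function of the big outcome $\omega=(\omega_1,\dots,\omega_n)$, and the weak law says that for large $n$ this function is close to $E(X_1)$ on a set of $\omega$'s of probability near 1. A quick simulation sketch in Python (illustrative, with $E(X_1)=0.5$ coin flips):

```python
import random

random.seed(0)

# One sampled value of the function xbar_n: draw one omega,
# i.e. n independent flips mapped to {0, 1}, and average them.
def xbar(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Estimate P(|xbar_n - 0.5| < 0.01) from 200 sampled omegas,
# for two values of n; the fraction should grow toward 1 with n.
for n in (100, 10000):
    hits = sum(abs(xbar(n) - 0.5) < 0.01 for _ in range(200))
    print(n, hits / 200)
```

So no single function becomes constant; rather, the sequence of functions $\bar{X}_n$ converges to a constant *in probability*, measured by $P$ on the sample space.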