What's the intuition behind $\sigma(X_1, X_2,\dots)$ in asymptotic analysis?


In asymptotic analysis, we consider countably many random variables $X_1, X_2,\dots$, and $\sigma(X_1,X_2,\dots)$ denotes the smallest $\sigma$-field containing the information of $X_1, X_2,\dots$, i.e., the smallest $\sigma$-field with respect to which every $X_n$ is measurable.

My question arises from the fact that $\sigma(X_1)\subset\sigma(X_1,X_2)\subset\cdots\subset\sigma(X_1,X_2,\dots)$.

Does this mean that $\sigma(X_1,X_2,\dots)$ is an infinitely large $\sigma$-field, and that by considering all of $X_1,X_2,\dots$ we have infinite "information"?

If not, at which point does this increasing sequence of $\sigma$-fields converge? Or when does it stop growing?


Yes, $\sigma(X_1,X_2,\dots)$ is indeed infinitely large: it contains all of the information obtained by observing the whole sequence $X_1,X_2,\dots$. In general the increasing family doesn't "stop," and there is no finite stage at which it "converges"; the limit object is the $\sigma$-field generated by the union $\bigcup_{n}\sigma(X_1,\dots,X_n)$, which is typically strictly larger than each finite stage. The exception is when the later $X_n$ are eventually all functions of earlier ones; then they add no new information. For example, if $X_n = (X_1)^n$ for $n > 1$, then each $X_n$ is $\sigma(X_1)$-measurable, so $\sigma(X_1,X_2,\dots) = \sigma(X_1)$.
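To make "the information of the whole sequence" precise, one can write the limit object explicitly. The display below is a standard sketch; the coin-toss tail event is an illustrative example I am adding, not something from the question itself:

```latex
% The sigma-field of the whole sequence is generated by the
% union of the finite-stage sigma-fields:
\[
  \sigma(X_1, X_2, \dots)
  \;=\;
  \sigma\!\Big(\bigcup_{n=1}^{\infty} \sigma(X_1, \dots, X_n)\Big).
\]
% The union itself is closed under finite set operations (it is an
% algebra) but in general NOT a sigma-field. For instance, with
% i.i.d. fair coin tosses $X_n \in \{0,1\}$, the tail event
\[
  A \;=\; \Big\{\, \lim_{n\to\infty} \tfrac{1}{n}\textstyle\sum_{k=1}^{n} X_k
  \;=\; \tfrac{1}{2} \,\Big\}
\]
% lies in sigma(X_1, X_2, ...) but in no finite sigma(X_1, ..., X_n),
% since membership in A cannot be decided from finitely many tosses.
```

This is why the inclusions are strict in general: events like $A$ above genuinely require the whole sequence, so the $\sigma$-fields keep growing unless the later variables are functions of the earlier ones.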