Probability measure associated with random vector and its components


As you know, the definition of random vector is the following:

Let $(\Omega, \mathcal{F})$ be a measurable space. A function $\mathbf{X}: \Omega \to \mathbb{R}^n$ is called a random vector on $(\Omega, \mathcal{F})$ if it is $\mathcal{F}$-measurable, i.e. $\{\omega: \mathbf{X}(\omega) \in B\} \in \mathcal{F}$ for all $B \in \mathcal{B}(\mathbb{R}^n)$.

Here $\mathcal{B}(\mathbb{R}^n)$ is the Borel sigma-algebra on $\mathbb{R}^n$.

And it is a well-known fact that every component $X_i: \Omega \to \mathbb{R}$ of a random vector $\mathbf{X} = (X_1, \ldots, X_n)$ is also an $\mathcal{F}$-measurable function on $(\Omega, \mathcal{F})$.
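On a finite sample space this fact can be checked directly: the preimage of a Borel set $B_i$ under $X_i$ equals the preimage under $\mathbf{X}$ of the cylinder $\mathbb{R} \times \cdots \times B_i \times \cdots \times \mathbb{R}$, which lies in $\mathcal{F}$ by assumption. A minimal sketch (the sample space and the values of $\mathbf{X}$ are my own illustrative choices; on a finite space with the power-set sigma-algebra everything is measurable, so this only demonstrates the preimage identity behind the proof):

```python
# Finite sample space Omega = {0, 1, 2, 3}; a random vector X: Omega -> R^2.
Omega = [0, 1, 2, 3]
X = {0: (0.0, 1.0), 1: (0.0, 2.0), 2: (1.0, 1.0), 3: (1.0, 2.0)}

def preimage_vector(B):
    """{omega : X(omega) in B} for a subset B of R^2, given as a predicate."""
    return {w for w in Omega if B(X[w])}

def preimage_component(i, Bi):
    """{omega : X_i(omega) in B_i} for a predicate B_i on R."""
    return {w for w in Omega if Bi(X[w][i])}

# The preimage of B_1 under X_1 equals the preimage under X of the
# cylinder B_1 x R, so X_1 inherits measurability from X.
B1 = lambda x: x <= 0.5            # B_1 = (-inf, 0.5]
cylinder = lambda p: B1(p[0])      # B_1 x R
assert preimage_component(0, B1) == preimage_vector(cylinder)
print(sorted(preimage_component(0, B1)))  # -> [0, 1]
```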

But the definition above says nothing about a measure. So let's equip $(\Omega, \mathcal{F})$ with a measure $P$ to obtain a probability space $(\Omega, \mathcal{F}, P)$, and then define the random vector $\mathbf{X}$ on this "domain" probability space.

Is it correct to assume that the components $X_1, \ldots, X_n$ must be defined on the same probability space $(\Omega, \mathcal{F}, P)$ as the vector $\mathbf{X} = (X_1, \ldots, X_n)$?

I said above that $\mathbf{X}$ and its components $X_1, \ldots, X_n$ are defined on the same measurable space $(\Omega, \mathcal{F})$. But if we consider a probability space instead of a measurable space, should this space still be the same for $\mathbf{X}$ and every component $X_i$?

P.S. Keep in mind that I am not talking about the distributions of $\mathbf{X}$ and $X_i$ (i.e. about the induced measures); I know those can be different. I am talking about the "domain" measure $P$ on $(\Omega, \mathcal{F})$.


EDIT
Indeed, in a few probability textbooks and on Wikipedia there is a requirement that the components $X_i$ all be defined on one common probability space $(\Omega, \mathcal{F}, P)$.

Also it is interesting to look at the well-known formula for a random vector $\mathbf{X} = (X_1, \ldots, X_n)$ with independent components. For a measurable rectangle $B = B_1 \times \cdots \times B_n$ it reads:

$$P(\mathbf{X} \in B) = \prod_{i=1}^n P(X_i \in B_i).$$

Note that the same measure $P$ appears on both sides of this identity. I take this to mean that the random vector $\mathbf{X}$ and each of its components $X_i$ are defined on the same "domain" probability space $(\Omega, \mathcal{F}, P)$.
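As a concrete sanity check of this identity, one can take a finite probability space where the components are independent by construction and compute both sides exactly. A small sketch (the uniform measure on $\{0,1\}^2$ and the sets $B_1, B_2$ are my own illustrative choices):

```python
from itertools import product
from fractions import Fraction

# Omega = {0,1} x {0,1} with the uniform "domain" measure P.
Omega = list(product([0, 1], repeat=2))
P = {w: Fraction(1, 4) for w in Omega}

X1 = lambda w: w[0]   # components read off the coordinates,
X2 = lambda w: w[1]   # so X = (X1, X2) and both live on (Omega, F, P)

def prob(event):
    """P of the event {omega : event(omega)} under the domain measure."""
    return sum(P[w] for w in Omega if event(w))

B1, B2 = {0}, {0, 1}  # B = B1 x B2, a measurable rectangle
lhs = prob(lambda w: X1(w) in B1 and X2(w) in B2)                # P(X in B)
rhs = prob(lambda w: X1(w) in B1) * prob(lambda w: X2(w) in B2)  # product
assert lhs == rhs == Fraction(1, 2)
```

Both sides are computed with the single measure `P`, which is exactly the point: the identity is only even expressible because $\mathbf{X}$, $X_1$, and $X_2$ share one domain space.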

What do you think? Should we always assume that the components $X_1, \ldots, X_n$ are defined on the same probability space $(\Omega, \mathcal{F}, P)$ as the vector $\mathbf{X} = (X_1, \ldots, X_n)$?

BEST ANSWER

I'm not sure whether this will resolve your confusion, but I hope it helps.

The way I see it, it does not matter whether you define the components on different probability spaces, because you can always build one big space containing them all, and conversely split the big space into several smaller ones.

  1. Consider a random vector $\mathbf{X}=(X_1,X_2,\ldots,X_n)$ whose components are defined on different probability spaces $(\Omega_i,\mathcal{F}_i,P_i),\ 1\leq i\leq n$. Define a product measurable space on $\Omega=\prod_{i=1}^n\Omega_i$ by setting $$\mathcal{F}=\sigma\left\{\left.\prod_{i=1}^n A_i\,\right|\,A_i\in\mathcal{F}_i\right\},$$ and recall that there exists a unique probability measure $P$ on $(\Omega,\mathcal{F})$ such that $$P\left(\prod_{i=1}^n A_i\right)=\prod_{i=1}^n P_i(A_i),\qquad A_i\in\mathcal{F}_i.$$ Now redefine the variables by letting $$Y_i(\omega_1,\omega_2,\ldots,\omega_n)=X_i(\omega_i),\qquad \omega_j\in\Omega_j.$$ The value of $Y_i$ depends only on $\omega_i$, so we can treat $Y_i$ the same as $X_i$; in particular, each $Y_i$ has the same distribution as $X_i$ (though note that under the product measure the $Y_i$ are automatically independent). Thus we now have a single probability space $(\Omega,\mathcal{F},P)$ carrying all the components. Each $Y_i$ ($1\leq i\leq n$) is measurable on $(\Omega,\mathcal{F},P)$, since for every $B\in\mathcal{B}(\mathbb{R})$ $$Y_i^{-1}(B)=\Omega_1\times\cdots\times\Omega_{i-1}\times X_i^{-1}(B)\times\Omega_{i+1}\times\cdots\times\Omega_n\in\mathcal{F}.$$
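Step 1 can be sketched with two finite spaces: build the product measure and check that each redefined variable $Y_i$ keeps the distribution of the original $X_i$. The specific spaces, measures, and values below are my own illustrative choices:

```python
from itertools import product
from fractions import Fraction

# Two separate probability spaces (Omega_i, F_i, P_i), one variable on each.
Omega1, P1 = ["a", "b"], {"a": Fraction(1, 3), "b": Fraction(2, 3)}
Omega2, P2 = ["c", "d"], {"c": Fraction(1, 2), "d": Fraction(1, 2)}
X1 = {"a": 0, "b": 1}
X2 = {"c": 5, "d": 7}

# Product space: Omega = Omega1 x Omega2, with P(A1 x A2) = P1(A1) * P2(A2).
Omega = list(product(Omega1, Omega2))
P = {(w1, w2): P1[w1] * P2[w2] for (w1, w2) in Omega}

# Redefined variables Y_i(w1, w2) = X_i(w_i), both on the same big space.
Y1 = lambda w: X1[w[0]]
Y2 = lambda w: X2[w[1]]

def dist(Z, space, measure):
    """Distribution of Z as a dict value -> probability."""
    d = {}
    for w in space:
        d[Z(w)] = d.get(Z(w), Fraction(0)) + measure[w]
    return d

# Each Y_i has the same distribution as the original X_i.
assert dist(Y1, Omega, P) == dist(lambda w: X1[w], Omega1, P1)
assert dist(Y2, Omega, P) == dist(lambda w: X2[w], Omega2, P2)
```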

  2. Conversely, we can cut the big probability space down into smaller ones by setting $$\Omega_i=\Omega,\quad \mathcal{F}_i=\sigma(X_i),\quad P_i=P|_{\mathcal{F}_i}.$$
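Step 2 is also easy to make concrete on a finite space: $\sigma(X_i)$ consists of the preimages $X_i^{-1}(B)$, and the restricted measure $P|_{\sigma(X_i)}$ is just $P$ evaluated on those sets. A small sketch with an illustrative two-valued component:

```python
from itertools import combinations
from fractions import Fraction

Omega = [0, 1, 2, 3]
P = {w: Fraction(1, 4) for w in Omega}
X1 = {0: "low", 1: "low", 2: "high", 3: "high"}  # one component of X

# sigma(X1) on a finite space: all unions of the atoms X1^{-1}({value}).
values = sorted(set(X1.values()))
atoms = [frozenset(w for w in Omega if X1[w] == v) for v in values]
sigma_X1 = set()
for r in range(len(atoms) + 1):
    for combo in combinations(atoms, r):
        sigma_X1.add(frozenset().union(*combo))

# P restricted to sigma(X1): the same measure on a smaller sigma-algebra.
P_restricted = {A: sum(P[w] for w in A) for A in sigma_X1}

assert frozenset({0, 1}) in sigma_X1        # the event {X1 = "low"}
assert P_restricted[frozenset({0, 1})] == Fraction(1, 2)
assert len(sigma_X1) == 4                   # {}, {0,1}, {2,3}, Omega
```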

I think your definition contributes to intuitive understanding, but in a mathematical sense it is all the same.