A random variable $X$ is $P$-a.s. constant iff $\sigma(X)$ is $P$-trivial - for which state spaces?


Let $(\Omega,\mathcal F, P)$ be a probability space, $(E,\mathcal E)$ a measurable space, and $X\colon\Omega\to E$ an ($\mathcal F$-$\mathcal E$-measurable) random variable.

In the case $(E,\mathcal E)=(\mathbb R,\mathcal B (\mathbb R))$, it is well known that

$$ P(X=c)=1 \quad \text{for some $c\in\mathbb R$} \quad \iff \quad P(A)\in\{0,1\} \quad\text{for all $A\in\sigma(X)$.}$$

The direction "$\Rightarrow$" is trivial and the direction "$\Leftarrow$" follows easily from looking at the distribution function $F(x)=P(X\le x)$. The argument can obviously be extended to the multi-dimensional case, and it seems plausible that one should be able to extend it to other cases in which the concept of a distribution function is no longer meaningful.
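On a finite toy space the equivalence can be checked by brute force, since there $\sigma(X)$ consists exactly of all unions of the atoms $X^{-1}(\{e\})$. The following sketch (with hypothetical data: `omega`, `X1`, `X2` and the weight dictionaries are made up for illustration) verifies both directions on two small examples:

```python
from fractions import Fraction
from itertools import combinations

def sigma_of_X(omega, X):
    # atoms of sigma(X) are the preimages of single values; on a finite
    # Omega, sigma(X) is the collection of all unions of these atoms
    atoms = [frozenset(w for w in omega if X[w] == v) for v in set(X.values())]
    return [frozenset().union(*c)
            for r in range(len(atoms) + 1)
            for c in combinations(atoms, r)]

def P(event, weights):
    return sum((weights[w] for w in event), Fraction(0))

def as_constant(omega, X, weights):
    # P(X = c) = 1 for some value c?
    return any(P(frozenset(w for w in omega if X[w] == v), weights) == 1
               for v in set(X.values()))

def sigma_trivial(omega, X, weights):
    # P(A) in {0, 1} for every A in sigma(X)?
    return all(P(A, weights) in (Fraction(0), Fraction(1))
               for A in sigma_of_X(omega, X))

omega = [0, 1, 2]
half, quarter = Fraction(1, 2), Fraction(1, 4)

# X a.s. constant (the point 2 carries zero mass): both sides hold
X1 = {0: 'a', 1: 'a', 2: 'b'}
w1 = {0: half, 1: half, 2: Fraction(0)}
assert as_constant(omega, X1, w1) and sigma_trivial(omega, X1, w1)

# X genuinely non-constant: both sides fail
X2 = {0: 'a', 1: 'b', 2: 'b'}
w2 = {0: half, 1: quarter, 2: quarter}
assert not as_constant(omega, X2, w2) and not sigma_trivial(omega, X2, w2)
```

This is of course only a sanity check in the finite case, where the equivalence is immediate; the question below concerns general state spaces.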

Question: Are there (reasonably simple and verifiable) conditions on $(E,\mathcal E)$ under which the above equivalence still holds?

Obviously, one has to assume that $\mathcal E$ contains all singletons. Problems can also arise if the measure $P$ only takes the values $0$ and $1$ to begin with (as in the example in GEdgar's answer below).



Accepted answer:

Assume $E$ is a second countable Hausdorff topological space; note that every separable metric space, such as $\Bbb R^n$ or $C[0,1]$, belongs to this class. Let $\mathcal E$ be the Borel $\sigma$-algebra on $E$, write $\Bbb P:=P\circ X^{-1}$ for the pushforward measure of $P$ under $X$, and assume that $\Bbb P(A)\in \{0,1\}$ for every $A\in\mathcal E$. Let $\{U_n\}_{n\ge 1}$ be a countable base of $E$ and define the support $C$ of $\Bbb P$ by $$ C=E\setminus \bigcup\{U_k:\Bbb P(U_k)=0\}. $$ Then $C$ is closed and, being the complement of a countable union of $\Bbb P$-null sets, has full measure $\Bbb P(C)=1$; in particular $C\ne\varnothing$.

Now suppose $C$ contains two distinct points $x\ne y$. Since $E$ is Hausdorff and the $U_n$ form a base, there are indices $\alpha\ne\beta$ with $x\in U_\alpha$, $y\in U_\beta$, and $U_\alpha\cap U_\beta=\varnothing$. If $\Bbb P(U_\alpha)=0$, then $C\cap U_\alpha=\varnothing$ by the definition of $C$, contradicting $x\in C\cap U_\alpha$; hence $\Bbb P(U_\alpha)=1$, and likewise $\Bbb P(U_\beta)=1$. But then $\Bbb P(U_\alpha)+\Bbb P(U_\beta)=2>1$ even though the two sets are disjoint, a contradiction. Therefore $C=\{p\}$ for some $p\in E$, establishing $\Bbb P(\{p\})=1$ as wanted.
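The support construction above can be illustrated numerically (a sketch, not the proof): take $E=[0,1]$, $\Bbb P$ a point mass at $p=1/3$, and as (truncated) countable base the open dyadic intervals up to a finite depth $D$ — the truncation and the particular choice of $p$ and $D$ are assumptions of this toy example. Removing every base interval of measure zero brackets the support inside a dyadic interval of width $2^{-D}$ around $p$:

```python
from fractions import Fraction

p = Fraction(1, 3)   # point mass location (hypothetical choice)
D = 12               # truncation depth of the dyadic base (hypothetical)

def measure(a, b):
    # P((a, b)) for the point mass at p
    return 1 if a < p < b else 0

# collect every base interval of measure zero; C lies in their complement
null_union = [(Fraction(k, 2**d), Fraction(k + 1, 2**d))
              for d in range(D + 1) for k in range(2**d)
              if measure(Fraction(k, 2**d), Fraction(k + 1, 2**d)) == 0]

# C is contained in the dyadic interval of depth D containing p:
lo = max(b for (a, b) in null_union if b <= p)
hi = min(a for (a, b) in null_union if a >= p)
print(lo, "<= C <=", hi)   # bracket of width 2**-D, shrinking to {p} as D grows
```

As $D\to\infty$ the bracket collapses to the single point $p$, mirroring the conclusion $C=\{p\}$ of the argument.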

Answer by GEdgar:

An example where the equivalence fails: let $\Omega = E$ be an uncountable set, let $\mathcal F = \mathcal E$ be the $\sigma$-algebra of countable and co-countable subsets, let $X \colon \Omega \to E$ be the identity, and let $P$ be the probability measure that assigns $1$ to co-countable sets and $0$ to countable sets. Then $P(A)\in\{0,1\}$ for every $A\in\sigma(X)=\mathcal F$, yet $P(X=c)=P(\{c\})=0$ for every $c\in E$, so $X$ is not $P$-a.s. constant.
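Since $\Omega$ is uncountable, this example cannot be enumerated, but it can be encoded symbolically (a hypothetical encoding for illustration): a set in $\mathcal F$ is represented by a tag plus a finite "exceptional" set, where `('ctble', S)` stands for the countable set $S$ and `('coctble', S)` for its complement. The sketch checks that $P$ is $\{0,1\}$-valued while no singleton has full measure:

```python
def P(event):
    # P is 0 on countable sets and 1 on co-countable sets
    tag, _ = event
    return 0 if tag == 'ctble' else 1

def complement(event):
    # the complement of a countable set is co-countable, and vice versa
    tag, S = event
    return ('coctble' if tag == 'ctble' else 'ctble', S)

# P takes only the values 0 and 1, so sigma(X) = F is P-trivial ...
assert P(('ctble', frozenset())) == 0        # P(empty set) = 0
assert P(('coctble', frozenset())) == 1      # P(Omega) = 1
assert P(complement(('ctble', frozenset({'x'})))) == 1   # complements flip 0 <-> 1

# ... yet no singleton has full measure, so X = id is not a.s. constant:
for point in ['x', 'y', 'z']:                # stand-ins for points of E
    assert P(('ctble', frozenset({point}))) == 0
print("P is {0,1}-valued, but P(X = c) = 0 for every c")
```

The encoding only witnesses the two properties in question; it does not, of course, verify countable additivity of $P$, which holds because a countable union of countable sets is countable.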