If $E[e^{i\langle s,X\rangle}1_F]=E[e^{i\langle s,X\rangle}]\,P[F]$ for all $s\in\mathbb R^d$ and all $F \in \mathcal F$, then $X$ is independent of $\mathcal F$


The claim in the title seems very plausible, since the characteristic function "characterizes", i.e. determines, the distribution of $X$, but I don't know how to derive it. There is a similar result for the characteristic functions of two random variables (e.g. here), but I'm not sure whether it can be deduced from that.

Any help would be appreciated!



BEST ANSWER

If $Z$ is any bounded random variable measurable with respect to $\mathcal F$, then $Z$ is a uniform limit of simple functions measurable with respect to $\mathcal F$. Since the hypothesis extends from indicators to simple functions by linearity, and from simple functions to their uniform limits by bounded convergence, it follows that $E[e^{i\langle x, X \rangle} e^{i\langle y, Y \rangle}] = E[e^{i\langle x, X \rangle}]\, E[e^{i\langle y, Y \rangle}]$ for all $x, y$ and every $\mathcal F$-measurable random variable $Y$. The joint characteristic function of $(X, Y)$ therefore factorizes, so $X$ is independent of $Y$ for every $\mathcal F$-measurable random variable $Y$, i.e. $X$ is independent of $\mathcal F$.
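To spell out the extension step (a sketch, in the notation above): for a simple function $Z_n = \sum_k c_k 1_{F_k}$ with $F_k \in \mathcal F$, linearity and the hypothesis give

$$E\big[e^{i\langle x, X \rangle} Z_n\big] = \sum_k c_k\, E\big[e^{i\langle x, X \rangle} 1_{F_k}\big] = \sum_k c_k\, E\big[e^{i\langle x, X \rangle}\big] P[F_k] = E\big[e^{i\langle x, X \rangle}\big] E[Z_n],$$

and if $Z_n \to Z$ uniformly with the $Z_n$ bounded, both sides converge, so $E[e^{i\langle x, X \rangle} Z] = E[e^{i\langle x, X \rangle}] E[Z]$ for every bounded $\mathcal F$-measurable $Z$; taking $Z = e^{i\langle y, Y \rangle}$ yields the factorization used above.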


Reading my own post one year later, I would give the following answer. By Kac's theorem, to prove independence of $X$ and $1_F$ it suffices to show that for all $s \in \mathbb R^d$ and $t \in \mathbb R$ we have

$$E\big[e^{i(\langle s, X \rangle + t\, 1_F)}\big] = E\big[e^{i \langle s, X \rangle}\big]\, E\big[e^{i t 1_F}\big].$$

But $e^{i t 1_F} = 1_{F^c} + 1_F\, e^{it}$ and $e^{i(\langle s, X \rangle + t\, 1_F)} = 1_{F^c}\, e^{i \langle s, X \rangle} + 1_F\, e^{i \langle s, X \rangle} e^{it}$, so, since expectation is additive and constants can be pulled out of it, the condition in the title is exactly what is needed.
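Explicitly (a sketch of that reduction): taking expectations and using that both $F$ and $F^c$ belong to $\mathcal F$,

$$E\big[e^{i(\langle s, X \rangle + t\, 1_F)}\big] = E\big[1_{F^c}\, e^{i\langle s, X \rangle}\big] + e^{it} E\big[1_F\, e^{i\langle s, X \rangle}\big] = E\big[e^{i\langle s, X \rangle}\big]\big(P[F^c] + e^{it} P[F]\big) = E\big[e^{i\langle s, X \rangle}\big] E\big[e^{i t 1_F}\big],$$

where the middle equality is exactly the condition in the title. Hence $X$ is independent of $1_F$ for every $F \in \mathcal F$, which is independence of $X$ from $\mathcal F$.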