I was trying to understand the Markov property of Brownian motion, and in one of the proofs the author uses the following result:
Let $\mathcal{F} \subseteq \mathcal{A}$ be a $\sigma$-algebra and $X:\Omega \to \mathbb{R}^d$ a random variable such that $$\mathbb{E}(F(X) 1_A) = \mathbb{P}(A) \mathbb{E}(F(X))$$ for all $A \in \mathcal{F}$ and all bounded continuous $F$. Then $X$ and $\mathcal{F}$ are independent.
I understand that if they were independent then the above equality would hold, but I cannot prove why the equality implies independence using the basic definition of independence. I have thought a lot about it, but I was unsuccessful. Can you give me some hints on how I could go about showing it?
Suppose that
$$\mathbb{E}(F(X) 1_A) = \mathbb{P}(A) \mathbb{E}(F(X)) \qquad \text{for all $A \in \mathcal{F}$}. \tag{1}$$
Since
$$e^{i \eta 1_A} = e^{i \eta} 1_{A} + 1_{A^c}\tag{2}$$
for any $\eta \in \mathbb{R}$ we find $$\begin{align*} \mathbb{E}(e^{i \xi X} e^{i \eta 1_A}) &\stackrel{(2)}{=} e^{i \eta} \mathbb{E}(e^{i \xi X} 1_A) + \mathbb{E}(e^{i \xi X} 1_{A^c}) \\ &\stackrel{(1)}{=} e^{i \eta} \mathbb{P}(A) \mathbb{E}(e^{i \xi X}) + \mathbb{P}(A^c) \mathbb{E}(e^{i \xi X}) \\ &\stackrel{(2)}{=} \mathbb{E}(e^{i \eta 1_A}) \mathbb{E}(e^{i \xi X}) . \end{align*}$$
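Identity (2) is purely algebraic: $1_A$ only takes the values $0$ and $1$, and plugging in either value makes both sides agree. A throwaway numerical check (not part of the proof):

```python
import cmath

# Check e^{i*eta*a} == e^{i*eta}*a + (1 - a) for both possible
# values a of the indicator 1_A and a few sample values of eta.
for eta in [0.3, 1.7, -2.5]:
    for a in [0, 1]:
        lhs = cmath.exp(1j * eta * a)
        rhs = cmath.exp(1j * eta) * a + (1 - a)
        assert abs(lhs - rhs) < 1e-12
```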
This implies that $X$ and $1_A$ are independent (see this answer for details). Since $A \in \mathcal{F}$ was arbitrary, it follows that $X$ and $\mathcal{F}$ are independent.
Alternative proof: For any closed set $G$ there exists a sequence of bounded continuous functions $(F_n)_{n \in \mathbb{N}}$ such that $F_n \downarrow 1_G$. Using the dominated convergence theorem (the sequence is uniformly bounded and $\mathbb{P}$ is a finite measure), we get
$$\mathbb{E}(1_G(X) 1_A) = \mathbb{P}(A) \mathbb{E}(1_G(X)).$$
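One concrete choice of such a sequence (this particular formula is my addition; any sequence with the stated properties works) uses the distance to $G$:

$$F_n(x) = \max\{0,\ 1 - n\,d(x,G)\}, \qquad d(x,G) := \inf_{y \in G} |x-y|.$$

Each $F_n$ is continuous (even Lipschitz) and bounded by $1$, and $F_n \downarrow 1_G$ because $G$ is closed, so $d(x,G) = 0$ if and only if $x \in G$.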
Since the closed sets generate the Borel-$\sigma$-algebra, it is not difficult to see that this implies
$$\mathbb{E}(1_B(X) 1_A) = \mathbb{P}(A) \mathbb{E}(1_B(X)) \quad \text{for all $B \in \mathcal{B}(\mathbb{R}^d)$, $A \in \mathcal{F}$}$$
which shows that $X$ and $\mathcal{F}$ are independent.
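For completeness, the extension from closed sets to all Borel sets is a standard Dynkin-system argument. The collection

$$\mathcal{D} := \{B \in \mathcal{B}(\mathbb{R}^d) : \mathbb{E}(1_B(X) 1_A) = \mathbb{P}(A)\, \mathbb{E}(1_B(X)) \text{ for all } A \in \mathcal{F}\}$$

is a $\lambda$-system: it contains $\mathbb{R}^d$, is closed under complements (subtract the identity for $B$ from the one for $\mathbb{R}^d$), and is closed under increasing limits by monotone convergence. Since $\mathcal{D}$ contains the closed sets, which form a $\pi$-system generating $\mathcal{B}(\mathbb{R}^d)$, the $\pi$-$\lambda$ theorem gives $\mathcal{D} = \mathcal{B}(\mathbb{R}^d)$.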