Does convergence of conditional variance to 0 imply convergence of unconditional variance to 0?


Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space and let $G$ be a sub-$\sigma$-field of $\mathcal{F}$. Suppose $\{X_k\}_{k \in \mathbb{N}}$ is a sequence of square-integrable real-valued random variables on this space, none of which is $G$-measurable, such that $\operatorname{Var}(X_{k}|G)$ converges in probability to $0$.

Does this imply that $X_k$ converges to $0$, and if so in what sense (almost surely or in probability)? Furthermore, and most importantly for my purposes, does $\operatorname{Var}(X_{k})$ converge to $0$?

Best answer:

Consider $X_k = X + a_k$, where $X \in \mathscr{L}^2(\Omega, \mathscr{G}, \mathbf{P})$ and $(a_k)$ is a sequence of real numbers with $a_k \to 0$. Since $X$ is $\mathscr{G}$-measurable, $\mathbf{E}(X_k \mid \mathscr{G}) = X + a_k$ and $\mathbf{E}(X_k^2 \mid \mathscr{G}) = (X + a_k)^2$, so $$ \mathbf{V}(X_k \mid \mathscr{G}) = \mathbf{E}(X_k^2 \mid \mathscr{G}) - \mathbf{E}(X_k \mid \mathscr{G})^2 = 0 $$ for every $k$; in particular the conditional variance converges to $0$ almost surely, and a fortiori in probability. Clearly $X_k \to X$ almost surely, and taking $X$ to be a nonzero random variable (e.g. Gaussian) shows that neither $X_k \to 0$ nor $\mathbf{V}(X_k) \to 0$; indeed $\mathbf{V}(X_k) = \mathbf{V}(X)$ for every $k$. (If you insist that the $X_k$ themselves not be $\mathscr{G}$-measurable, perturb by an independent noise: take $X_k = X + Z/k$ with $Z$ standard normal and independent of $\mathscr{G}$. Then $\mathbf{V}(X_k \mid \mathscr{G}) = 1/k^2 \to 0$ while $\mathbf{V}(X_k) = \mathbf{V}(X) + 1/k^2 \not\to 0$.)
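A quick numerical sanity check of the counterexample, as a hedged sketch: take $\mathscr{G} = \sigma(X)$ with $X \sim N(0,1)$ and $a_k = 1/k$ (the sample size and these specific choices are illustrative assumptions, not from the answer above). Conditioning on $\mathscr{G}$ reveals $X$, so $\mathbf{V}(X_k \mid \mathscr{G}) = 0$ identically, while the unconditional variance stays at $\mathbf{V}(X) = 1$:

```python
import random
import statistics

random.seed(0)

# Illustrative choice: X ~ N(0, 1) and G = sigma(X), so conditioning on G
# reveals X exactly.
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

for k in (1, 10, 100):
    a_k = 1.0 / k
    xk = [x + a_k for x in xs]
    # Given G, X_k = X + a_k is a known constant shift of X, so
    # Var(X_k | G) = 0 for every k.  The unconditional variance is
    # shift-invariant, hence stays near Var(X) = 1 for all k:
    print(f"k={k}: Var(X_k) ~ {statistics.variance(xk):.4f}")
```

The printed variances do not shrink as $k$ grows, matching the conclusion that $\mathbf{V}(X_k \mid \mathscr{G}) \to 0$ says nothing about $\mathbf{V}(X_k) \to 0$.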