Conditional uniform convergence in probability


I am wondering whether it makes sense to say that (uniform) convergence in probability holds conditionally on a random variable, or whether there is a better way to express this.

Consider two sequences of real-valued random variables $X_n$ and $Y_n(\theta)$, where $Y_n$ depends on a parameter $\theta \in \Theta$ and $\Theta$ is a compact subset of $\mathbb{R}$.

Convergence in probability: I show that, for any $\varepsilon > 0$ and $\theta \in \Theta$, $\mathbb{P}\left(\left|Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n)\right|>\varepsilon \,\middle|\, X_n\right) \to 0$ as $n \to \infty$. Note that this conditional probability is itself a random variable (a function of $X_n$), so the convergence should be understood in some mode, e.g. almost surely. On that basis, I conclude that $Y_n(\theta)$ converges in probability to $\mathbb{E}(Y_n(\theta)\mid X_n)$ conditionally on $X_n$. Is that terminology correct?
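As a sanity check on the terminology, here is a sketch of how the conditional statement implies an unconditional one, assuming the conditional convergence above holds almost surely. Since conditional probabilities are bounded by $1$, dominated convergence gives

$$\mathbb{P}\left(\left|Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n)\right|>\varepsilon\right) = \mathbb{E}\left[\mathbb{P}\left(\left|Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n)\right|>\varepsilon \,\middle|\, X_n\right)\right] \to 0,$$

so $Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n) \to 0$ in probability unconditionally as well.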

Uniform convergence in probability: I also show that $Y_n(\theta)$ and $\mathbb{E}(Y_n(\theta)\mid X_n)$ are continuously differentiable in $\theta$ and that their derivatives are bounded on the compact set $\Theta$. Can I then say that $Y_n(\theta)$ converges uniformly in probability to $\mathbb{E}(Y_n(\theta)\mid X_n)$ conditionally on $X_n$?
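To make the uniform claim precise, the statement I would be targeting (writing the conditional analogue of the usual definition) is

$$\mathbb{P}\left(\sup_{\theta \in \Theta}\left|Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n)\right|>\varepsilon \,\middle|\, X_n\right) \to 0 \quad \text{for every } \varepsilon > 0.$$

My understanding (this is an assumption on my part, not something I have proved) is that pointwise convergence plus a Lipschitz bound on $\theta \mapsto Y_n(\theta) - \mathbb{E}(Y_n(\theta)\mid X_n)$ that is uniform in $n$ would give this via a standard compactness/covering argument on $\Theta$; boundedness of the derivatives for each fixed $n$ alone would not suffice unless the bound is uniform in $n$.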