Definition: Convergence in probability on a set


I know what it means for a sequence of random variables $(X_n)$ to converge to a random variable $X$ in probability. But what exactly does it mean for $(X_n)$ to converge to $X$ in probability on a (measurable) set $A$?


Best answer:

$X_n \to X$ in probability on $A$ if $P(A \cap \{|X_n-X| >\epsilon\}) \to 0$ as $n \to \infty$ for every $\epsilon>0$. In particular, if $P(A)=1$ this reduces to ordinary convergence in probability, and convergence in probability on the whole space implies convergence in probability on every measurable $A$.
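The definition can be illustrated numerically. Below is a small Monte Carlo sketch (not from the original answer; the example sequence and the helper `prob_bad` are hypothetical choices): take $\Omega = [0,1]$ with the uniform measure, $A = [0, 1/2]$, and a sequence $X_n$ that is identically $0$ on $A$ but oscillates on the complement. Then $X_n \to 0$ in probability on $A$, while $(X_n)$ does not converge in probability on all of $\Omega$.

```python
import random

random.seed(0)

def X_n(n, w):
    # Hypothetical example sequence: equals 0 on A = [0, 1/2],
    # oscillates between 0 and 1 on (1/2, 1].
    if w <= 0.5:
        return 0.0
    return 1.0 if n % 2 == 1 else 0.0

def X(w):
    # Candidate limit: the zero random variable.
    return 0.0

def prob_bad(n, eps=0.1, restrict_to_A=True, trials=100_000):
    """Monte Carlo estimate of P(A ∩ {|X_n - X| > eps})
    (or P(|X_n - X| > eps) when restrict_to_A is False),
    with A = [0, 1/2] and omega ~ Uniform[0, 1]."""
    count = 0
    for _ in range(trials):
        w = random.random()
        in_A = w <= 0.5
        if (in_A or not restrict_to_A) and abs(X_n(n, w) - X(w)) > eps:
            count += 1
    return count / trials

# Restricted to A, the event {|X_n - X| > eps} never occurs,
# so the estimate is 0 for every n; unrestricted, it stays near
# 1/2 along odd n, so there is no convergence on the whole space.
print(prob_bad(101, restrict_to_A=True))   # ~0.0
print(prob_bad(101, restrict_to_A=False))  # ~0.5
```

The point of the example: the probability in the definition is computed only over the part of the "bad" event that meets $A$, so bad behavior outside $A$ is irrelevant.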