Does convergence almost everywhere imply convergence in probability in continuous time?


Let $\{X_t\}_{t\geq 0}$ be a continuous-time stochastic process on a probability space $(\Omega, \mathcal F, \mathbb P)$. Assume $X_\infty := \lim_{t \to \infty} X_t$ exists almost surely. Is it then true that $X_t \to X_\infty$ in probability?

I know that in the discrete case we can argue as follows: given $\epsilon>0$, for each $n \in \mathbb N$ the set $A_n := \bigcup_{m \geq n} \{|X_m-X_\infty| \geq \epsilon\}$ is measurable as a countable union of measurable sets. Furthermore, $A_n$ decreases to $\bigcap_n A_n$, which is contained in the null set where $X_m \not\to X_\infty$, so by continuity from above $\mathbb P(A_n) \to 0$, and in particular $\mathbb P(|X_n - X_\infty| \geq \epsilon) \to 0$.

But if we index over $[0,\infty)$ instead, this argument breaks down: the analogous sets $A_t := \bigcup_{s \geq t} \{|X_s-X_\infty| \geq \epsilon\}$ are uncountable unions and need not be measurable. We can still deduce that $X_{t_k} \to X_{\infty}$ in probability for any sequence $t_k \uparrow \infty$. Is this enough to guarantee that $X_t \to X_\infty$ in probability?
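As a sanity check of what the question asks, here is a minimal Monte Carlo sketch with an assumed toy process $X_t = Z e^{-t}$ (my choice, not from the question), which converges to $X_\infty = 0$ almost surely; the estimated probabilities $\mathbb P(|X_t| > \epsilon)$ shrink as $t$ grows, as convergence in probability requires:

```python
import math
import random

# Hypothetical example process: X_t = Z * exp(-t), Z standard normal.
# Then X_t -> 0 almost surely as t -> infinity; we estimate
# P(|X_t - X_inf| > eps) at a few times t by Monte Carlo.

random.seed(0)
eps = 0.1
n_samples = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(n_samples)]

def prob_exceeds(t):
    """Monte Carlo estimate of P(|X_t| > eps) for X_t = Z * exp(-t)."""
    return sum(abs(z) * math.exp(-t) > eps for z in samples) / n_samples

estimates = [prob_exceeds(t) for t in (0.0, 1.0, 2.0, 4.0, 8.0)]
print(estimates)  # estimates decrease towards 0 as t grows
```

Of course, a simulation proves nothing; it only illustrates the statement whose proof the answer below supplies.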

Best answer:

Since $X_t\to X_\infty$ almost surely, for any sequence $(t_k)\subset[0,\infty)$ increasing to $\infty$ you have

$$X_{t_k}\xrightarrow{k\to\infty} X_\infty \quad \text{$\Bbb P$-a.s.}$$

and hence also

$$X_{t_k}\xrightarrow{k\to\infty} X_\infty \quad \text{in probability under $\Bbb P$.} \tag{1}$$

Now, let $\epsilon>0$ and

$$a_t(\epsilon):= \Bbb P(|X_t-X_\infty|>\epsilon).$$

Due to (1), we know that

$$ a_{t_k}(\epsilon)=\Bbb P(|X_{t_k}-X_\infty|>\epsilon)\xrightarrow{k\to \infty}0.$$

Since the sequence $(t_k)$ was arbitrary, the sequential criterion for limits of real-valued functions applies: if we had $a_t(\epsilon) \not\to 0$ as $t\to\infty$, we could pick $\delta>0$ and an increasing sequence $t_k\uparrow\infty$ with $a_{t_k}(\epsilon)\geq\delta$ for all $k$, contradicting the above. Hence

$$ \Bbb P(|X_t-X_\infty|>\epsilon)=a_t(\epsilon)\xrightarrow{t\to \infty}0.$$
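The contrapositive construction behind this step can be made concrete with a toy deterministic function (my own illustration, unrelated to any particular process): $a_t := \sin^2(t)$ does not tend to $0$, and an increasing sequence witnessing the failure is easy to exhibit:

```python
import math

# Toy illustration of the sequential criterion: a_t := sin(t)**2 does NOT
# tend to 0 as t -> infinity. The contrapositive construction produces an
# increasing sequence t_k -> infinity with a_{t_k} >= delta for all k.

def a(t):
    return math.sin(t) ** 2

delta = 0.5
# t_k = pi/2 + k*pi is increasing to infinity and a_{t_k} = 1 >= delta.
ts = [math.pi / 2 + k * math.pi for k in range(10)]
witness_values = [a(t) for t in ts]
print(witness_values)  # all equal to 1 up to rounding error
```

So convergence along every increasing sequence really does force $a_t \to 0$: were it otherwise, some such witnessing sequence would exist.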

Since this works for every $\epsilon>0$, we have

$$X_t\xrightarrow{t\to\infty} X_\infty \quad \text{in probability under $\Bbb P$.} $$


The main point is that $X_t\to X_\infty$ in probability if and only if $X_{t_k}\to X_\infty$ in probability for every sequence $(t_k)\subset[0,\infty)$ increasing to $\infty$, because $t\mapsto\Bbb P(|X_t-X_\infty|>\epsilon)$ is an ordinary real-valued function, to which the sequential criterion for limits applies. Note, however, that the analogous equivalence fails for almost sure convergence: the exceptional null set may depend on the sequence, and there are uncountably many sequences. For example, if $U$ is uniform on $[0,1]$ and $X_t := \mathbf{1}\{t-\lfloor t\rfloor = U\}$, then $X_{t_k}\to 0$ a.s. along every fixed sequence $t_k\uparrow\infty$ (each event $\{t_k-\lfloor t_k\rfloor = U\}$ has probability zero), yet $X_t(\omega)\not\to 0$ for any $\omega$ (take $t_k = U(\omega)+k$).