Many books on stochastics take ample time to explain what it means for a sequence of random variables to converge a.s., in $L_p$, in probability, or in distribution, what $\limsup$ and $\liminf$ mean, etc.
However, as far as I can tell, the issue of limits is usually brushed over when one passes from discrete to continuous time. Things like the difference between $M_n \to M_\infty$ a.s. and $M_t\to M_\infty$ a.s. (I don't consider this good notation, by the way) seem rather elusive. Perhaps the idea is that these things should be known from analysis; I'm not sure.
What book or notes specifically address the fundamental issues of limits of stochastic processes in continuous time?
Specifically, I mean just the definitions of, e.g., $\limsup_{t\to \infty} X_t$ and basic properties (equivalent definitions, relations between the modes of convergence).
I'm not looking for advanced results, like a martingale convergence theorem.
Let me try to specify what I think would make sense as definitions.
Let $f : [0,\infty) \to \mathbb R$ and $l\in \mathbb R$. Then $f(t) \to l$ (as $t \to \infty$) if the net $(f(t))_{t \geq 0}$ converges to $l$, i.e.
$$\forall \varepsilon > 0 : \exists c > 0 : \forall t > c : |f(t) - l| < \varepsilon$$
We then write $\lim_{t\to\infty}f(t) := l$.
We let
$$\limsup_{x \to \infty} f(x) = \lim_{x\to \infty} \sup_{y\geq x} f(y)$$
$$\liminf_{x \to \infty} f(x) = \lim_{x\to \infty} \inf_{y\geq x} f(y)$$
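As a numerical sanity check on these definitions (just a sketch; the helper `tail_sup` and the example $f$ are made up for illustration), the tail suprema $\sup_{y\geq x} f(y)$ should be nonincreasing in $x$ and decrease toward the $\limsup$:

```python
import math

def tail_sup(f, c, window=200.0, step=0.01):
    """Approximate sup_{y >= c} f(y) by a max over the finite grid
    [c, c + window); a numerical stand-in for the true supremum."""
    n = int(window / step)
    return max(f(c + k * step) for k in range(n))

# Example: f(t) = sin(t) + 1/(1+t), so limsup_{t->inf} f(t) = 1.
f = lambda t: math.sin(t) + 1.0 / (1.0 + t)

# sup_{y >= x} f(y) is nonincreasing in x and tends to the limsup.
sups = [tail_sup(f, c) for c in (10.0, 100.0, 1000.0)]
print(sups)  # decreasing values approaching 1
```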
Given a stochastic process $(X_t)_{t\in [0,\infty)}$ we say that $X_t \to X_\infty$ a.s. for some random variable $X_\infty$ if
$$\{X_t \to X_\infty\} = \{ \omega \in \Omega : X_t(\omega) \to X_\infty(\omega)\}$$
has measure $1$.
We say that $X_t \to X_\infty$ in $L_p$ if it converges as a net in $L_p$, i.e.
$$\forall \varepsilon > 0 : \exists c > 0 : \forall t > c : \|X_t - X_\infty\|_p < \varepsilon$$
We say that $X_t \to X_\infty$ in probability if for all $\varepsilon > 0$ we have $$\mathbb P(|X_t - X_\infty| > \varepsilon) \to 0$$
Good so far?
Then I'm interested in basic properties (like people discuss in the discrete-time case). Consider (as an example!) the following statement from discrete time:
$X_n \to X_\infty$ a.s. if, for every $\varepsilon > 0$,
$$\sum_{n\geq 1} \mathbb P(|X_n - X_\infty|>\varepsilon) < \infty$$
(this direction is Borel–Cantelli; the converse is false in general).
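For what it's worth, the Borel–Cantelli direction is easy to check empirically in discrete time. Here is a throwaway simulation (names and parameters are made up for illustration) with $X_n \sim \mathrm{Bernoulli}(1/n^2)$ independent and $X_\infty = 0$:

```python
import random

random.seed(0)

# X_n ~ Bernoulli(1/n^2), independent; X_infty = 0. For any eps in (0, 1),
# sum_n P(|X_n - 0| > eps) = sum_n 1/n^2 < infinity, so Borel-Cantelli
# says only finitely many X_n exceed eps along almost every path.
N = 100_000
exceedances = [n for n in range(1, N + 1) if random.random() < 1.0 / n**2]

# On this sample path the exceedances all sit at small n, consistent
# with X_n -> 0 a.s.
print(len(exceedances), exceedances)
```

Of course this says nothing about what the right continuous-time analogue of the sum would be, which is part of what I'm asking.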
Does this work in continuous time?
Another example: in a proof of the martingale convergence theorem it is argued that "
$$\mathbb P\left( \bigcup_{\lambda > 0} \bigcap_{m\geq 1} \bigcup_{t\geq m} \{|M_t - M_\infty| > \lambda\}\right) = 0,$$
hence $M_t \to M_\infty$ a.s.".
Why is that? It's probably a simple argument. My point is I want to have a resource where I can look up "simple" things like that.
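My own guess at the simple argument (writing it out in case someone can confirm): negating the quantifiers in the net-convergence definition above gives the set identity
$$\{M_t \to M_\infty\}^c = \bigcup_{\lambda > 0} \bigcap_{m\geq 1} \bigcup_{t\geq m} \{|M_t - M_\infty| > \lambda\},$$
and since the inner event shrinks as $\lambda$ grows, the outer union may be taken over $\lambda = 1/k$, $k \in \mathbb N$, to keep it countable. If this event has probability $0$, then $\{M_t \to M_\infty\}$ has probability $1$, i.e. $M_t \to M_\infty$ a.s. Is that the intended argument?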
Probabilities and Potential by Dellacherie and Meyer has an entire chapter on continuous-time martingales: their a.s. convergence, uniform integrability, optional sampling in the continuous case, etc.