What are some good texts/notes on the construction of probability spaces for stochastic processes?


Often in probability the underlying probability space is left implicit. That is, for a probability space $(\Omega,\mathcal F,\mathbb P)$, one tends to write $\mathbb P(E)$ for $E\in\mathcal F$ rather than the precise expression $$ \mathbb P\left(\{\omega\in\Omega: \omega\in E\} \right); $$ for a random variable $X:\Omega\to\mathbb R$ it is standard to write e.g. $\mathbb P(X\leqslant x)$ as opposed to $$\mathbb P\left(\{\omega\in\Omega: X(\omega)\leqslant x\}\right); $$ and in particular to write $\mathbb E[X]$ as $$\int_{\mathbb R} x\,\mathsf dF(x) $$ as opposed to $$\int_\Omega X(\omega)\,\mathsf d\mathbb P(\omega). $$ For discrete distributions this is a moot point: $\Omega$ is at most countable, so we may simply take $\mathcal F=2^\Omega$ and fully characterize the distribution of a random variable $X$ by its probability mass function $\{p_n\}$, i.e. if $\{x_n\}$ is an enumeration of $E=X(\Omega)$ then $\{p_n\}$ is the unique positive sequence such that $p_n=\mathbb P(X=x_n)$.
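To make the discrete case above concrete, here is a minimal sketch (my own illustration, not from any referenced text) of an explicit finite probability space $(\Omega, 2^\Omega, \mathbb P)$ for a fair die, computing $\mathbb P(X\leqslant x)$ as a bona fide subset of $\Omega$ and checking that $\int_\Omega X\,\mathsf d\mathbb P$ agrees with the sum against the pmf:

```python
from fractions import Fraction

# Explicit probability space for a fair six-sided die, with F = 2^Omega.
Omega = {1, 2, 3, 4, 5, 6}
P = {omega: Fraction(1, 6) for omega in Omega}  # P given by its values on atoms

# A random variable X : Omega -> R, here X(omega) = omega^2 (arbitrary choice).
def X(omega):
    return omega ** 2

# P(E) for an event E, i.e. a subset of Omega.
def prob(event):
    return sum(P[omega] for omega in event)

# P(X <= 10) written out as P({omega in Omega : X(omega) <= 10}).
p = prob({omega for omega in Omega if X(omega) <= 10})  # atoms 1, 2, 3

# E[X] computed two ways: as an integral over Omega, and via the pmf of X.
E_omega = sum(X(omega) * P[omega] for omega in Omega)
pmf = {}
for omega in Omega:
    pmf[X(omega)] = pmf.get(X(omega), Fraction(0)) + P[omega]
E_pmf = sum(x * p_x for x, p_x in pmf.items())
assert E_omega == E_pmf
```

Both expectations equal $\tfrac{1}{6}\sum_{k=1}^{6}k^2 = \tfrac{91}{6}$, illustrating why, in the discrete case, one can work with the pmf and never mention $\Omega$.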

But in more general settings the sample space $\Omega$, the $\sigma$-algebra $\mathcal F$ and the probability measure $\mathbb P$ are typically not explicitly defined, as in this example. It is well known that a finite sum of i.i.d. Bernoulli random variables has a binomial distribution, but the converse is not true, as shown in this example, in which user @Did gave conditions on the probability space under which a binomial random variable can or cannot be written as a sum of i.i.d. Bernoulli random variables.
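The positive direction can be exhibited on the canonical product space: a sketch (my own, with parameters $n=4$, $q=1/3$ chosen arbitrarily) taking $\Omega=\{0,1\}^n$ with the product measure, so that the coordinate maps $X_i(\omega)=\omega_i$ are i.i.d. Bernoulli and their sum is exactly binomial:

```python
from itertools import product
from fractions import Fraction
from math import comb

# Canonical space Omega = {0,1}^n with the i.i.d. Bernoulli(q) product measure.
n, q = 4, Fraction(1, 3)
Omega = list(product((0, 1), repeat=n))
P = {omega: q**sum(omega) * (1 - q)**(n - sum(omega)) for omega in Omega}

def S(omega):
    # Sum of the coordinate Bernoulli variables X_i(omega) = omega[i].
    return sum(omega)

# P(S = k) agrees exactly with the binomial pmf C(n,k) q^k (1-q)^(n-k).
for k in range(n + 1):
    p_k = sum(P[omega] for omega in Omega if S(omega) == k)
    assert p_k == comb(n, k) * q**k * (1 - q)**(n - k)
```

The converse direction discussed in the linked answer is exactly about when a given space carries enough structure to support such coordinate maps; on a too-coarse space no such decomposition need exist.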

I recently acquired the texts Topics in the Constructive Theory of Countable Markov Chains by Fayolle et al. and Theory of Markov Processes by Dynkin. These are quite advanced texts, however, so I would appreciate recommendations for a more elementary treatment of this topic, and in particular for treatments that are not focused specifically on Markov processes.

Best answer:

From here:


In Kolmogorov's construction of the probability space for a stochastic process, the space of elementary events $\Omega$ is taken to be the set of all trajectories $t \to \omega(t)$. The random variable $a_t$ is defined as $$a_t(\omega) = \omega(t).$$ Construction of the probability measure $P$ serving all finite-dimensional random vectors is a mathematically sophisticated task, going back to the construction of the Wiener measure on the space of continuous functions.

Thus, the Kolmogorov construction of a stochastic process is based not on "hidden variables" (e.g. the initial conditions) determining future results of measurements in advance, but rather on the results themselves.
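For reference, the measure $P$ above is produced by the Kolmogorov extension theorem from a family of finite-dimensional distributions $\mu_{t_1,\dots,t_n}$; this is a standard statement (not part of the quoted answer). The family must satisfy two consistency conditions, permutation consistency and marginal consistency:

```latex
% Kolmogorov consistency conditions for the family (mu_{t_1,...,t_n}):
% (i) invariance under permutations pi of the indices,
% (ii) compatibility of marginals under dropping a coordinate.
\begin{align*}
\mu_{t_{\pi(1)},\dots,t_{\pi(n)}}\left(A_{\pi(1)}\times\cdots\times A_{\pi(n)}\right)
  &= \mu_{t_1,\dots,t_n}\left(A_1\times\cdots\times A_n\right),\\
\mu_{t_1,\dots,t_n}\left(A_1\times\cdots\times A_{n-1}\times\mathbb R\right)
  &= \mu_{t_1,\dots,t_{n-1}}\left(A_1\times\cdots\times A_{n-1}\right).
\end{align*}
```

Given these, there is a unique probability measure $P$ on the trajectory space $\Omega=\mathbb R^T$ (with the product $\sigma$-algebra) whose coordinate process $a_t(\omega)=\omega(t)$ has exactly these finite-dimensional distributions.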


I also hope this PDF will be useful to you. Hope it helps.