Is construction of the following Markov chain possible?


A proof in the book 'Markov Chains and Mixing Times' by Levin, Peres et al. (Proposition 5.7) uses the following fact:

Let $\chi$ be finite. Given some sequence of random variables $ \{ S_t \}_{t \in \mathbb{N}} $ on $\chi$, and a $\chi \times \chi$ transition matrix $Q$, we can construct a time-homogeneous Markov chain $ (X_t) $ with transition matrix $Q$ such that for all $t$, $S_t$ is independent of the Markov chain $ (X_t) $.

I haven't formally studied measure theory, so I only know bits and pieces. Here is where I am stuck:

I know that given the sequence of pmf's $\mu_0$, $\mu_0 Q$, $\mu_0 Q^2, \ldots$ on a finite state space $\chi$, we can always construct a sequence of random variables $(X_t)$ where $X_t$ has distribution $\mu_0 Q^t$, by the Kolmogorov extension theorem. I am having trouble with the independence part: how can we construct $(X_t)$ so that $X_t$ is independent of $S_t$ for all $t$?
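As a concrete illustration of the marginal laws involved, here is a minimal sketch computing $\mu_0 Q^t$ by repeated right-multiplication; the particular $\mu_0$ and $Q$ below are made-up examples, not anything from the book.

```python
# Sketch: the marginal laws mu_0 Q^t of a finite-state chain.
# mu0 and Q below are illustrative assumptions, not from the book.
mu0 = [1.0, 0.0, 0.0]          # start deterministically in state 0
Q = [[0.5, 0.5, 0.0],
     [0.2, 0.5, 0.3],
     [0.0, 0.5, 0.5]]

def push(mu, Q):
    """One step of the evolution: return the row vector mu Q."""
    n = len(mu)
    return [sum(mu[i] * Q[i][j] for i in range(n)) for j in range(n)]

mu = mu0
for t in range(5):
    mu = push(mu, Q)
print([round(p, 4) for p in mu])   # the distribution mu_0 Q^5
```

Each $\mu_0 Q^t$ is again a pmf, which is what the Kolmogorov extension theorem consumes.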


Best answer:

Suppose the variables $\{S_t\}$ are defined on a probability space $\Omega_1$.

Proposition 1.5 (page 6) in the book you mention describes how to construct a random mapping representation using i.i.d. uniform variables for a finite Markov chain. This means you can define the chain $\{X_t\}_{t=0}^\infty$ on an infinite product space $\Omega_0= [0,1]^{\mathbb N}$ endowed with product measure and the product $\sigma$-algebra, where the coordinates are equipped with Lebesgue measure. (This is more elementary than invoking the Kolmogorov extension theorem.) To get the independence you seek, simply consider the product space $\Omega_0 \times \Omega_1$, endowed with the product measure and product $\sigma$-algebra.
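The random mapping representation mentioned above can be sketched as follows: the next state is a deterministic function $f(x, u)$ of the current state $x$ and an i.i.d. uniform $u$, via the inverse CDF of row $Q[x]$. The specific matrix $Q$ is an assumption for illustration.

```python
import random

# Illustrative 3-state transition matrix (an assumption, not from the book).
Q = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def f(x, u):
    """Random mapping: send state x and a uniform u in [0,1)
    to the next state via the inverse CDF of row Q[x]."""
    cum = 0.0
    for y, p in enumerate(Q[x]):
        cum += p
        if u < cum:
            return y
    return len(Q[x]) - 1   # guard against floating-point round-off

def run_chain(x0, uniforms):
    """Drive the chain X_{t+1} = f(X_t, U_{t+1}) from i.i.d. uniforms."""
    xs = [x0]
    for u in uniforms:
        xs.append(f(xs[-1], u))
    return xs

random.seed(0)
us = [random.random() for _ in range(10)]
print(run_chain(0, us))
```

Since the whole trajectory is a measurable function of $(x_0, U_1, U_2, \ldots)$, defining the chain on $\Omega_0 = [0,1]^{\mathbb N}$ is exactly this construction.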

[1] https://www.yuval-peres-books.com/markov-chains-and-mixing-times/

Second answer:

The key difference is that we start by constructing a joint probability space carrying both the given $\{S_t\}$ and the Markov chain $\{X_t\}$. Independence then holds by construction: the chain is driven by randomness independent of $\{S_t\}$.

Here's an attempt at the updated pseudocode:

```
Given: {S_t}, a sequence of random variables on a space Ω1,
       and a transition matrix Q on a finite state space.

1. Construct a joint probability space (Ω, F, P) = Ω0 × Ω1 with the
   product measure, where Ω0 carries i.i.d. uniforms {U_t}.

2. Set up the Markov chain on this space:
       X_0 ~ initial distribution        (using randomness from Ω0)
       for t = 1, 2, ...:
           X_t = f(X_{t-1}, U_t)         (random mapping representation)

3. X_t is independent of S_t for every t: the whole chain (X_t) is a
   function of the randomness on Ω0 alone, which is independent of Ω1.
```
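To make the construction tangible, here is a minimal simulation sketch: we draw $S_t$ from its own source of randomness, drive the chain from separate uniforms, and check empirically that the joint frequencies of $(X_t, S_t)$ factor into the product of the marginals. The two-state $Q$ and the law of $S_t$ are assumptions chosen for illustration.

```python
import random

# Sketch: chain driven by randomness independent of {S_t};
# Q and the law of S_t are illustrative assumptions.
Q = [[0.7, 0.3],
     [0.4, 0.6]]

def step(x, u):
    """One step of the chain via the inverse CDF of row Q[x]."""
    return 0 if u < Q[x][0] else 1

random.seed(1)
t, trials = 5, 200_000
joint = {}                       # counts of (X_t, S_t)
px = {0: 0, 1: 0}                # counts of X_t
ps = {0: 0, 1: 0}                # counts of S_t
for _ in range(trials):
    s = random.randint(0, 1)     # S_t: its own, separate randomness
    x = 0
    for _ in range(t):           # chain driven by independent uniforms
        x = step(x, random.random())
    joint[(x, s)] = joint.get((x, s), 0) + 1
    px[x] += 1
    ps[s] += 1

for (x, s), n in sorted(joint.items()):
    print(x, s, round(n / trials, 3),
          round(px[x] / trials * ps[s] / trials, 3))
```

The printed joint frequencies match the product of the marginals up to sampling noise, reflecting that independence comes from the product-space construction, not from the Markov property itself.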