How to prove a random walk on a continuous monoid is a Markov chain?


$\DeclareMathOperator{\prob}{P}$Consider a measurable monoid $M$ with measurable product and inverse. Let $X_n$ be a sequence of independent random variables in $M$. Define $Z_n = X_{n-1} \dotsm X_0$. Do the $Z_n$ form a Markov chain?

One would like to be able to say the following: $$\begin{align} &\prob(\, Z_{n+1}\in A \mid Z_n=z_n,\, \dots,\, Z_0=z_0 \,) \\ &= \prob(\, X_n z_n \in A \mid Z_n=z_n \,) \\ &= \prob(\, Z_{n+1} \in A \mid Z_n=z_n \,) \end{align}$$ since $X_n$ is independent of $\{\, Z_k \mid k<n \,\}$. Unfortunately there are some technical details which fail; to uncover them let me recall some measure-theoretic probability theory.
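As a concrete sanity check (a minimal sketch, not part of the argument): the recursion $Z_{n+1} = X_n Z_n$ is easy to simulate on a non-commutative monoid, here $2\times 2$ integer matrices under multiplication; the two generator matrices and the uniform step distribution are arbitrary choices of mine.

```python
import random

# The monoid: 2x2 integer matrices under matrix multiplication
# (a genuine monoid, since not every integer matrix is invertible).
def matmul(a, b):
    return ((a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]),
            (a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]))

IDENTITY = ((1, 0), (0, 1))
STEPS = [((1, 1), (0, 1)), ((1, 0), (1, 1))]  # two arbitrary generators

def walk(n, rng):
    """Return [Z_0, ..., Z_n] where Z_{k+1} = X_k Z_k and Z_0 is the identity."""
    zs = [IDENTITY]
    for _ in range(n):
        x = rng.choice(STEPS)         # X_k, drawn independently of the past
        zs.append(matmul(x, zs[-1]))  # left multiplication: Z_{k+1} = X_k Z_k
    return zs

print(walk(5, random.Random(0)))
```

Both generators chosen here have determinant $1$, so every $Z_k$ does too, which gives a cheap invariant to check against.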

Let $ X \colon \Omega \to S $ and $ Y \colon \Omega \to T $ be random variables. We shall write the pushforward probability measures as $X(\prob)$ and $Y(\prob)$. One defines the conditional probability $ \prob(\, X \in A \mid Y \,) \colon T \to [0,1] $ as a Radon–Nikodym derivative, by requiring $ \prob(\, X \in A,\, Y \in B \,) = \int_B \prob(\, X \in A \mid Y \,) \,dY(\prob) = \int_{Y\in B} \prob(\, X \in A \mid Y \,)(Y) \,d{\prob} $ for every measurable $ B \subseteq T $. It is thereby determined only almost surely w.r.t. the pushforward measure $Y(\prob)$. Warning: for fixed $y \in T$, as a function of $A \subseteq S$ these values need not form a measure, not even almost surely. There is a notion of a regular version of the conditional probability which addresses this issue, but I do not think we will need it here.

This definition encompasses conditioning w.r.t. an event $E$, by conditioning w.r.t. the event's characteristic function $[E]$: $\prob(\, X\in A \mid E \,) = \prob(\,X\in A\mid[E]\,)(1)$. If $\prob(E) > 0$, this means exactly that $1$ has positive measure w.r.t. the pushforward measure $[E](\prob)$, and so $\prob(\,X\in A\mid[E]\,)(1)$ is uniquely determined; one can check that it is indeed the familiar quotient $\prob(\{X\in A\}\cap E)/\!\prob(E)$. If however $\prob(E) = 0$, then $\prob(\,X\in A\mid[E]\,)(1)$ could be anything. Reassuringly, one can check more generally that if $T$ has measurable points and $\prob(Y = y) > 0$, then $\prob(\,X\in A\mid Y\,)(y) = \prob(\, X\in A \mid Y = y \,)$, where the latter is the conditional probability that $X\in A$ given the event $Y=y$.
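On a finite probability space the abstract definition can be checked by hand. Here is a sketch (the space, the variables $X$ and $Y$, and the sets involved are arbitrary choices of mine) verifying that the quotient formula satisfies the defining integral identity, which on a finite space reduces to a sum over the atoms of $Y$.

```python
from fractions import Fraction

# Finite probability space Omega = {0,...,5} with uniform measure,
# X(w) = w mod 3 taking values in S = {0,1,2}, Y(w) = w mod 2 in T = {0,1}.
omega = range(6)
P = {w: Fraction(1, 6) for w in omega}
X = lambda w: w % 3
Y = lambda w: w % 2

def cond_prob(A, y):
    """P(X in A | Y = y), computed as the quotient P(X in A, Y = y) / P(Y = y)."""
    num = sum(P[w] for w in omega if X(w) in A and Y(w) == y)
    den = sum(P[w] for w in omega if Y(w) == y)
    return num / den

# The defining identity P(X in A, Y in B) = int_B P(X in A | Y) dY(P)
# reduces on a finite space to a sum over the atoms y in B.
A = {0, 1}
for B in [{0}, {1}, {0, 1}]:
    lhs = sum(P[w] for w in omega if X(w) in A and Y(w) in B)
    rhs = sum(cond_prob(A, y) * sum(P[w] for w in omega if Y(w) == y) for y in B)
    assert lhs == rhs
print("defining identity verified for the quotient formula")
```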

Going back to the initial question: a sequence $Z_n$ of random variables is a Markov chain if $\prob(\, Z_{n+1} \in A \mid Z_n \times\dots\times Z_0 \,)(z_n,\dots,z_0) = \prob(\, Z_{n+1} \in A \mid Z_n \,)(z_n)$ for almost all $(z_n,\dots,z_0)$ w.r.t. the joint pushforward measure $(Z_n \times\dots\times Z_0)(\prob)$. Now we can try to justify the initial naïve reasoning, supposing points are measurable: indeed when $\prob(\,Z_n=z_n,\,\dots,\,Z_0=z_0\,) > 0$, we can assert $$\begin{align} &\prob(\, Z_{n+1}\in A \mid Z_n\times\dots\times Z_0 \,)(z_n,\dots,z_0) \\ &= \prob(\, Z_{n+1}\in A \mid Z_n=z_n,\, \dots,\, Z_0=z_0 \,) \\ &= \frac{\prob(\,Z_{n+1}\in A,\,Z_n=z_n,\,\dotsc\,)}{\prob(\,Z_n=z_n,\,\dotsc\,)} \\ &= \frac{\prob(\,X_nz_n\in A\,)\prob(\,Z_n=z_n,\,\dotsc\,)}{\prob(\,Z_n=z_n,\,\dotsc\,)} \\ &= \frac{\prob(\,X_nz_n\in A\,)\prob(\,Z_n=z_n\,)}{\prob(\,Z_n=z_n\,)} \\ &= \prob(\, Z_{n+1} \in A \mid Z_n \,)(z_n) , \end{align}$$ because we are dealing with actual events: the third equality uses the independence of $X_n$ from $(Z_n,\dots,Z_0)$. When $\prob(\,Z_n=z_n,\,\dots,\,Z_0=z_0\,) = 0$, though, we are no longer conditioning with respect to actual events. If the monoid $M$ is countable then there are only countably many tuples $(z_n,\dots,z_0)$ with $\prob(\,Z_n=z_n,\,\dots,\,Z_0=z_0\,) = 0$, each a null set for the joint law, so their union is still null and the above verifies the Markov property for almost all tuples. So my question is: can the sequence $Z_n$ still be shown to be a Markov chain when $M$ is uncountable (with measurable points)?
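When $M$ is finite the conclusion can at least be checked by brute-force enumeration. The sketch below (the monoid of maps $\{0,1\}\to\{0,1\}$ under composition, the step law, and the event $A$ are arbitrary choices of mine) computes the exact joint law of $(Z_1,\dots,Z_N)$ with rational arithmetic and verifies the Markov property on every positive-probability history.

```python
from fractions import Fraction
from itertools import product

# The monoid: all maps {0,1} -> {0,1} under composition, encoded as (f(0), f(1)).
def comp(f, g):
    return (f[g[0]], f[g[1]])  # (f . g)(x) = f(g(x))

IDENT = (0, 1)
# An arbitrary law for the i.i.d. steps X_k, with exact rational probabilities.
step = {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}

N = 3       # length of the chain segment (Z_1, ..., Z_N)
joint = {}  # exact joint law of (Z_1, ..., Z_N)
for xs in product(step, repeat=N):
    z, path, p = IDENT, [], Fraction(1)
    for x in xs:
        z = comp(x, z)  # Z_{k+1} = X_k Z_k
        path.append(z)
        p *= step[x]
    key = tuple(path)
    joint[key] = joint.get(key, Fraction(0)) + p

def cond(A, condition):
    """P(Z_N in A | condition holds for the history (Z_1, ..., Z_{N-1}))."""
    num = sum(p for path, p in joint.items() if path[-1] in A and condition(path[:-1]))
    den = sum(p for path, p in joint.items() if condition(path[:-1]))
    return num / den

A = {(0, 0)}  # an arbitrary event for Z_N
for hist in joint:
    prefix = hist[:-1]
    full = cond(A, lambda q: q == prefix)          # condition on the whole history
    last = cond(A, lambda q: q[-1] == prefix[-1])  # condition on Z_{N-1} only
    assert full == last
print("Markov property verified on every positive-probability history")
```

Using `Fraction` keeps every probability exact, so the two conditionals are compared with true equality rather than floating-point tolerance.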

On BEST ANSWER

$\DeclareMathOperator{\EE}{E}$You may be complicating this unnecessarily. Let me change your notation slightly and write $Y_n:=X_n\cdots X_1X_0$, where the $X_k$ are mutually independent, as before.

Define $\mathcal F_n:=\sigma\{X_k: k=0,\ldots,n\}$, $n\ge 0$. To say that $(Y_n)$ is a Markov chain simply means that for each $n$, the conditional distribution of $Y_{n+1}$, given $\mathcal F_n$, coincides (a.s.) with the conditional distribution of $Y_{n+1}$ given $\sigma\{Y_n\}$. But by the assumed independence, if $f:M\to\Bbb R$ is bounded and measurable, $$ \EE[f(Y_{n+1})\mid\mathcal F_n]=\EE[ f(X_{n+1}Y_n)\mid\mathcal F_n]=g_{n+1}(Y_n), $$ where $$ g_{n+1}(y):=\EE[f(X_{n+1}y)] $$ is a bounded measurable function of $y\in M$ (measurability follows from Fubini's theorem applied to $(x,y)\mapsto f(xy)$). Since $g_{n+1}(Y_n)$ is $\sigma\{Y_n\}$-measurable, the tower property gives $\EE[f(Y_{n+1})\mid Y_n]=g_{n+1}(Y_n)=\EE[f(Y_{n+1})\mid\mathcal F_n]$ a.s., which is precisely the Markov property.
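The identity $\EE[f(Y_{n+1})\mid\mathcal F_n]=g_{n+1}(Y_n)$ can be checked exactly on a finite example (a sketch; the monoid of maps $\{0,1\}\to\{0,1\}$, the step law, and the test function $f$ are my own arbitrary choices): on each atom of $\mathcal F_n$ the conditional expectation reduces to an average of $f(X_{n+1}Y_n)$ over the independent step $X_{n+1}$, and it depends on the atom only through $Y_n$.

```python
from fractions import Fraction
from itertools import product

# The monoid: maps {0,1} -> {0,1} under composition, encoded as (f(0), f(1)).
def comp(f, g):
    return (f[g[0]], f[g[1]])

# An arbitrary exact law for the i.i.d. steps X_k.
step = {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}
f = lambda m: Fraction(m[0] + 2 * m[1])  # an arbitrary bounded test function on M

def g(y):
    """g_{n+1}(y) := E[f(X_{n+1} y)], averaging over one independent step."""
    return sum(p * f(comp(x, y)) for x, p in step.items())

# On each atom {X_0 = xs[0], ..., X_n = xs[n]} of F_n, the conditional
# expectation E[f(Y_{n+1}) | F_n] is the average of f(X_{n+1} Y_n) over
# the independent step X_{n+1}; it depends on the atom only through Y_n.
for xs in product(step, repeat=3):  # every history up to n = 2
    y = (0, 1)                      # start from the identity map...
    for x in xs:
        y = comp(x, y)              # ...and build Y_n = X_n ... X_0
    cond_exp = sum(step[x1] * f(comp(x1, y)) for x1 in step)
    assert cond_exp == g(y)
print("E[f(Y_{n+1}) | F_n] = g_{n+1}(Y_n) on every atom of F_n")
```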