Let $X=(X_n)_{n\ge 1}$ be a (time-homogeneous) Markov chain on a countable state space $S$. I wonder: if $X$ is irreducible, recurrent, and aperiodic, does it necessarily follow that $P(X=s)=P(X_1=s_1,X_2=s_2,\ldots)=0$ for every nonrandom sequence $s=(s_1,s_2,s_3,\ldots)\in S^\infty$?
Probability of a Markov chain pursuing an infinite single path
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 best solutions below.
The other answers confine themselves to a chain with a finite number of states and have thoroughly addressed that situation, as long as the transition matrix is the same for all steps. However, if the number of states is countably infinite, there is a different story. Let $p_{j,k}$ denote the transition probability from state $j$ to state $k$. For this example, take a chain on the positive integers with $p_{k,k+1}=1-\frac{1}{2^k}$ (and, say, $p_{k,1}=\frac{1}{2^k}$, so the chain is irreducible and aperiodic; note it is then transient, not recurrent). Then the sequence of states $(X_1=1,X_2=2,X_3=3,X_4=4,\ldots)$ has probability $P=\frac{1}{2}\cdot\frac{3}{4}\cdot\frac{7}{8}\cdots$. Now $P\gt 0$ is equivalent to $\ln P$ being finite.
$\ln P=\sum_{k=1}^{\infty} \ln\left(1-\frac{1}{2^k}\right)$, which, using $\ln(1-x)\approx -x$ for small $x$, can be approximated by $\sum_{k=1}^\infty \frac{-1}{2^k}=-1$, giving $P\approx \frac{1}{e}\gt 0$.
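A quick numerical sanity check of this product (a plain-Python sketch; the exact limiting value is not quoted in the answer, so treat the printed numbers as an illustration only):

```python
# Partial products of P = prod_{k>=1} (1 - 2**(-k)), the path
# probability computed above, truncated after K factors.
def partial_product(K):
    p = 1.0
    for k in range(1, K + 1):
        p *= 1.0 - 2.0 ** (-k)
    return p

for K in (5, 10, 20, 50):
    print(K, partial_product(K))
```

The partial products stabilize near $0.2888$, confirming $P>0$; the crude estimate $1/e\approx 0.37$ overshoots because $\ln(1-x)<-x$.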
I think I can prove it in the finite state space case.
Here's a sketch of the idea:
Suppose that $s$ is such a sequence, with $P(s) > 0$. [Edit: As I've learned from Jack M's answer, the right replacement for the pigeonhole principle in the infinite state space case is to use the assumption that $X$ is recurrent.]
By the pigeonhole principle, there must be some state, say $a_1$, that is visited infinitely often.
Our sequence thus appears as $****a_1 ******* a_1 ****** \ldots$
Let $p = \max_{y} P(a_1, y)$. Each of those blocks begins with a transition out of $a_1$, which has probability at most $p$, so the probability of seeing the first $n$ blocks is at most $p^n$. Thus, provided that $p \neq 1$, the sequence has zero probability, which is a contradiction.
Thus $p = 1$ and this means that in our Markov chain there is a single outward transition from $a_1$ that occurs with probability one. Thinking of our Markov chain as a directed graph, we can contract this edge, deleting all occurrences of $a_1$ from $s$, and $P(s) > 0$ remains. (Obviously you have to be careful about this step -- in particular, you need to make sure that you add transition probabilities appropriately when you contract.)
(Edit: I think you can handle this without contractions. Once you have determined that there is only one transition out of $a_1$, apply the pigeonhole principle again to find another state $a_2$, and think of $s$ as a sequence of blocks starting with $a_2$ (after an initial $*****$). We then get a dichotomy: either $P(s) = 0$, or $a_2$ has a single edge out of it. I think you can continue in this way until you have 'uncovered' all the states that appear in $s$. Since each of them has only a single state it can transition to, the states of $s$ live on a cycle in the underlying directed graph of $X$. Since the chain is irreducible, this cycle has to be the entire chain, which contradicts aperiodicity.)
Continuing on in this way, we either end up with some stage at which $p < 1$, or we end up with a chain with one element. However, if we end up at that point, our original chain had to be a directed cycle (every transition probability is either $1$ or $0$, and the chain is connected), which is periodic.
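The $p^n$ bound in the sketch can be seen concretely on a small chain (a hypothetical 3-state example of my own choosing, not taken from the answer):

```python
# A small irreducible, aperiodic chain on {0, 1, 2}; the self-loops
# make it aperiodic, and 0 -> 1 -> 2 -> 0 makes it irreducible.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5]]

def path_prob(path):
    """Probability of following `path` exactly, given X_1 = path[0]."""
    pr = 1.0
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

# The path 0,1,2,0,1,2,... revisits state 0 infinitely often.  Each
# block starting at 0 begins with a transition of probability at most
# p = max_y P[0][y] = 0.5, so 10 blocks contribute at most 0.5**10;
# here every transition happens to have probability 0.5, so the exact
# value is 0.5**30.
path = [0, 1, 2] * 10 + [0]
print(path_prob(path))  # = 0.5 ** 30 ≈ 9.31e-10
```

Extending the path drives the probability to $0$ geometrically, which is exactly the contradiction the sketch exploits.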
Remark: I tried to get some information from the ergodic theorem, which tells you $\lim_{n \to \infty} \frac{1}{n} \sum_{i = 1}^n f(X_i) = \mathbb{E}_{\pi} (f)$, where $\pi$ is the stationary distribution. However, I had no luck with this. Maybe there is a strategy along these lines, though.
I will drop the assumption of aperiodicity because, as we will see, even without that assumption the claim is almost true.
Without aperiodicity, the claim fails even in a finite state space: we could have a degenerate Markov chain consisting of $n$ states connected in a cycle, with probability $1$ assigned to each transition. These are the only counterexamples, as we will see. If the chain is assumed aperiodic, even these counterexamples are ruled out.
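The degenerate cycle can be written down explicitly (a minimal sketch; the 3-state size and the matrix are my own choices):

```python
# A deterministic 3-cycle: every row puts mass 1 on a single successor,
# so the chain is periodic with period 3.
P = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [1.0, 0.0, 0.0]]

def path_prob(path):
    """Probability of following `path` exactly, given X_1 = path[0]."""
    pr = 1.0
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

# Starting from 0, the chain follows 0,1,2,0,1,2,... with certainty,
# so this fixed infinite sequence has probability 1, not 0.
print(path_prob([0, 1, 2] * 5 + [0]))  # prints 1.0
```

Any other sequence from state $0$ has probability $0$, so the cycle really is the extreme case.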
Since $X$ is recurrent, any state that appears in $s$ must appear infinitely often in $s$ if $P(X=s)$ is to be non-zero. Also, call a state $x$ degenerate if there is only one state accessible in one move from $x$. If every state in $s$ were degenerate, then, combined with the first observation, $s$ would be eventually periodic, tracing out a finite cycle; since the chain is irreducible, that cycle would have to cover the entire state space, and we would have one of the counterexamples discussed above. So assume there exists at least one non-degenerate state in $s$, call it $a$.
Let $s_{j_i}$ be the subsequence of $s$ consisting of the occurrences of $a$ (so $s_{j_i}=a$ for all $i$), and denote $p_i=P(X_{j_i+1}=s_{j_i+1}\mid X_{j_i}=a)$, i.e., the probability that $X$ is where it's supposed to be at stage $j_i+1$ given that it was where it was supposed to be at stage $j_i$. Then $P(X=s)\leq \prod_{i=0}^\infty p_i$. For an infinite product of values in $[0, 1]$ to be non-zero, the factors must at minimum tend to $1$. But each $p_i$ is a value of the probability mass function of the conditional distribution of $X_1$ given $X_0=a$, and since $a$ is non-degenerate, at least two states get positive mass, so $q:=\max_y P(X_1=y\mid X_0=a)<1$. Hence $p_i\le q<1$ for every $i$, the infinite product is $0$, and $P(X=s)=0$.
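The "factors must tend to $1$" point — and the fact that tending to $1$ is necessary but not sufficient — can be checked numerically (a sketch with factor sequences of my own choosing, not from the answer):

```python
def prod(factors):
    """Multiply out a finite sequence of factors in [0, 1]."""
    p = 1.0
    for f in factors:
        p *= f
    return p

N = 10 ** 5
# Factors tend to 1, but sum of (1 - p_i) diverges: product -> 0.
# Telescoping: prod_{n=2}^{N-1} (1 - 1/n) = 1/(N - 1).
diverging = prod(1 - 1 / n for n in range(2, N))
# Sum of (1 - p_i) converges: product has a positive limit.
# Telescoping: prod_{n=2}^{M} (1 - 1/n**2) = (M + 1)/(2M) -> 1/2.
converging = prod(1 - 1 / n ** 2 for n in range(2, N))
print(diverging, converging)
```

In the argument above the factors are bounded away from $1$ by $q$, which is an even stronger reason for the product to vanish.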
If we drop the recurrence requirement, keeping aperiodicity and irreducibility, there are counterexamples. Consider a chain on $\mathbb N$ in which $P(n\to n+1)=p_n$ and $P(n\to 0)=1-p_n$. This chain is aperiodic and irreducible as long as all the $p_n$ are strictly between $0$ and $1$, and the probability of following the sequence $s=(1,2,3,4,\ldots)$ is $\prod p_n$. Just choose the $p_n$ so that that product is non-zero.
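For concreteness, one such choice (my own; the answer leaves the $p_n$ unspecified) is $p_n = 1-\frac{1}{(n+1)^2}$, which keeps every $p_n$ strictly inside $(0,1)$ and makes the product telescope to $\frac12$:

```python
# Hypothetical escape probabilities for the transient counterexample:
# p_n = 1 - 1/(n+1)**2, strictly between 0 and 1 for every n >= 1.
def p(n):
    return 1.0 - 1.0 / (n + 1) ** 2

def path_prob(N):
    """Probability of the first N forward steps of s = (1, 2, 3, ...)."""
    pr = 1.0
    for n in range(1, N + 1):
        pr *= p(n)
    return pr

# prod_{m>=2} (1 - 1/m**2) telescopes to 1/2, so the full infinite
# path s has probability exactly 1/2 -- non-zero, as required.
print(path_prob(10 ** 6))
```

So with positive probability this chain escapes to infinity along the fixed path $s$, which is exactly why recurrence is the hypothesis that matters.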