Transience in a simple Markov chain


Consider the following simple game from a textbook called "Competitive Markov Processes" by Filar & Vrieze (Springer 1996).

A simple example of a transient stochastic game

This is a two-player game with two states. In the first state (the starting state) each player has two actions to choose from. For instance, if in state 1 the row player and the column player both choose action #1, then there is a 1/3 chance that the next state will be state 1 and a 2/3 chance that it will be state 2. In state 2 there is a 1/2 chance that the game stops; if it doesn't, we stay in state 2. Both players have a fixed strategy for state 1, i.e. each player has a fixed probability distribution over the two actions he/she can choose from while in state 1. The picture doesn't show these strategies.
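The dynamics above can be sketched in a short simulation. Only two numbers in it come from the description — the 1/3 stay-probability for the action pair (1, 1) in state 1, and the 1/2 stopping probability in state 2; the other stay-probabilities and the 50/50 mixed strategies are illustrative placeholders, since the diagram isn't reproduced here:

```python
import random

# Stay-probabilities for state 1, indexed by (row action, column action).
# Only (1, 1) comes from the problem statement; the rest are ASSUMED
# placeholder values, since the textbook's diagram is not shown here.
STAY_IN_1 = {
    (1, 1): 1 / 3,   # from the problem statement
    (1, 2): 1 / 2,   # assumed
    (2, 1): 1 / 2,   # assumed
    (2, 2): 1 / 4,   # assumed
}

def play_once(p_row=0.5, p_col=0.5):
    """Run one game from state 1 until it stops.

    p_row / p_col are the (fixed) probabilities that each player picks
    action 1 while in state 1.  Returns the number of visits to state 1.
    """
    visits_to_1 = 0
    state = 1
    while True:
        if state == 1:
            visits_to_1 += 1
            a = 1 if random.random() < p_row else 2
            b = 1 if random.random() < p_col else 2
            if random.random() >= STAY_IN_1[(a, b)]:
                state = 2
        else:  # state 2: stop with probability 1/2, else stay in state 2
            if random.random() < 0.5:
                return visits_to_1

if __name__ == "__main__":
    runs = [play_once() for _ in range(10_000)]
    print(f"all {len(runs)} runs stopped; "
          f"mean visits to state 1: {sum(runs) / len(runs):.2f}")
```

Every run leaves state 1 after only a handful of steps, matching the claim in question.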

As can be seen, the textbook claims that this game eventually reaches state 2. I'm sure this is a trivial claim, but I'm very rusty on my Markov Chains, so I would appreciate an explanation why this is so.

I realize that any given infinite sequence that doesn't leave state 1 has probability 0. However, the number of such sequences is not denumerable (for instance, all the sequences $a_1, a_2, \dots$ with $a_n \in \{1, 3\}$).

Best answer

Let $X$ be a Markov chain, let $N_j$ be the total number of visits to state $j$ and let $F(i, j)$ be the probability that, starting at state $i$, the Markov chain $X$ ever visits state $j$ (ever revisits state $j$, if $i = j$). We denote by $P_j\left(N_j < \infty\right)$ the probability that, starting at state $j$, $N_j$ is finite. Then we have (Corollary 2.11 of Çinlar, p. 123): $$ P_j\left(N_j < \infty\right) = \begin{cases} 1 & \text{if }F(j, j) < 1 \\ 0 & \text{if }F(j, j) = 1 \end{cases} $$
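For intuition behind the corollary: by the strong Markov property, the successive returns to $j$ are independent trials, each succeeding with probability $F(j, j)$, so (counting the initial visit) $N_j$ has a geometric distribution when the chain starts at $j$: $$ P_j\left(N_j = n\right) = F(j, j)^{\,n-1}\bigl(1 - F(j, j)\bigr), \qquad n = 1, 2, \dots $$ When $F(j, j) < 1$ these probabilities sum to $1$ over finite $n$, giving $P_j\left(N_j < \infty\right) = 1$ and $E_j\left[N_j\right] = 1/\bigl(1 - F(j, j)\bigr)$; when $F(j, j) = 1$ every term vanishes and $N_j$ is infinite almost surely.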

Now, the game described in OP's original post is essentially a finite-state Markov chain: once both players fix their strategies for state $1$, the randomization over actions folds into a single transition probability out of state $1$. Whatever strategies the players use, $F(1, 1) < 1$: per the textbook's diagram, every action pair in state $1$ gives a positive probability of moving to state $2$, and from state $2$ the chain can never return to state $1$. Hence, by the corollary stated above, the total number of visits to state $1$ is almost surely finite, i.e. the game eventually reaches state $2$.
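This also resolves the uncountability worry raised in the question: the event of never leaving state $1$ is a decreasing intersection of events, so by continuity of probability measures its probability is a limit, no matter how many individual paths the event contains. Writing $p < 1$ for the largest one-step probability of staying in state $1$ over all action pairs, $$ P_1\left(X_n = 1 \text{ for all } n\right) = \lim_{n \to \infty} P_1\left(X_0 = X_1 = \dots = X_n = 1\right) \le \lim_{n \to \infty} p^{\,n} = 0. $$ Uncountably many paths, each of probability zero, can still make up an event of probability zero.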

Works Cited

Çinlar, Erhan. "Introduction to Stochastic Processes." Dover, 2013.