Consider an agent that enters a casino with an integer amount of money $y$ such that $0 < y \leq 5$. The agent stops gambling upon reaching either 0 dollars (in which case the agent cannot play) or 5 dollars. When the agent has 1 or 2 dollars, the agent bets the entire amount. When the agent has 3 or 4 dollars, the agent bets the difference between 5 dollars and the current amount (e.g. with 3 dollars, the agent bets $5-3=2$ dollars). In each gamble, there is a probability $p$ of winning the bet (and gaining the amount wagered) and a probability $1-p$ of losing the bet (and losing that amount).

It is apparent that we can express this as a Markov chain, where each state is the dollar amount the agent has. Could someone help me classify the states (i.e. absorbing, periodic, transient, etc.)?
Markov Chains in a Casino
502 Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail At 2026-04-18 07:35:45
0 and 5 are absorbing states. Once entered, the process stays there.
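Writing out the full transition matrix makes this explicit: the rows for 0 and 5 each put all their mass on the state itself. A sketch in Python (the helper `build_P` and the sample value `p = 0.5` are my own choices, not from the problem):

```python
import numpy as np

def build_P(p):
    """Transition matrix for states 0..5 under the betting rule above."""
    P = np.zeros((6, 6))
    P[0, 0] = 1.0                  # 0 is absorbing: no money left to bet
    P[5, 5] = 1.0                  # 5 is absorbing: the agent stops
    P[1, 2], P[1, 0] = p, 1 - p    # state 1 bets 1: win -> 2, lose -> 0
    P[2, 4], P[2, 0] = p, 1 - p    # state 2 bets 2: win -> 4, lose -> 0
    P[3, 5], P[3, 1] = p, 1 - p    # state 3 bets 2: win -> 5, lose -> 1
    P[4, 5], P[4, 3] = p, 1 - p    # state 4 bets 1: win -> 5, lose -> 3
    return P

P = build_P(0.5)
print(P[0])   # row for state 0: all mass on state 0
print(P[5])   # row for state 5: all mass on state 5
```

Every row sums to 1, and the two absorbing rows are unit vectors, which is exactly the definition of an absorbing state.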
For states 1, 2, 3, 4, there is some probability you will immediately win enough bets in a row to reach 5, or lose enough to reach 0 (the absorbing states). Since, from whichever of these states you start, there is a non-zero probability of never returning, these states are transient.
Periodicity is a more difficult matter.
For instance, how can we get from state 1 back to state 1? There is only one way: 1 to 2 to 4 to 3 and back to 1. (Losing from 1 or 2 sends you to 0, and winning from 3 or 4 sends you to 5, so any return path must win twice and then lose twice.) Every return to 1 traverses this cycle, so the possible return times are exactly the multiples of 4, and state 1 is periodic with period 4.
To get from state 2 back to state 2 you can only go from 2 to 4 to 3 to 1 to 2. Period 4.
To get from state 3 back to state 3 you can only go from 3 to 1 to 2 to 4 to 3. Period 4.
Finally (you guessed it), for state 4 you can only go from 4 to 3 to 1 to 2 to 4. Period 4.
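The period can also be read off from powers of the transition matrix: $(P^n)_{11} > 0$ exactly when a return to state 1 in $n$ steps is possible, and the period is the gcd of those $n$. A sketch (again `p = 0.5` is arbitrary; the support of $P^n$ is the same for any $0 < p < 1$):

```python
import numpy as np
from math import gcd
from functools import reduce

p = 0.5
P = np.zeros((6, 6))
P[0, 0] = P[5, 5] = 1.0            # absorbing states
P[1, 2], P[1, 0] = p, 1 - p        # state 1 bets 1
P[2, 4], P[2, 0] = p, 1 - p        # state 2 bets 2
P[3, 5], P[3, 1] = p, 1 - p        # state 3 bets 2
P[4, 5], P[4, 3] = p, 1 - p        # state 4 bets 1

# Times n <= 20 at which a return to state 1 is possible.
return_times = [n for n in range(1, 21)
                if np.linalg.matrix_power(P, n)[1, 1] > 0]
print(return_times)               # [4, 8, 12, 16, 20]
print(reduce(gcd, return_times))  # period = gcd of return times = 4
```

The only positive diagonal entries for state 1 occur at multiples of 4, and their gcd is 4, matching the argument above.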