Markov chain: I don't understand how to read this matrix


Consider a gambling game in which, on any turn, you win \$1 with probability $0.4$ and lose \$1 with probability $0.6$. We have $p(i,j)=p\{X_{n+1}=j\mid X_n=i\}$ and thus, for $N=5$,

$$\big(p(i,j)\big)_{0\leq i,j\leq 5}=\begin{pmatrix}1.0&0&0&0&0&0\\0.6&0&0.4&0&0&0\\0&0.6&0&0.4&0&0\\0&0&0.6&0&0.4&0\\0&0&0&0.6&0&0.4\\0&0&0&0&0&1.0\end{pmatrix}.$$
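This matrix can be built and sanity-checked mechanically. Here is a minimal numpy sketch (the names `P`, `p_win`, `p_lose` are mine, not from the question):

```python
import numpy as np

N = 5                      # the game ends at fortune 0 or N
p_win, p_lose = 0.4, 0.6

# P[i, j] = p(i, j) = P(X_{n+1} = j | X_n = i)
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0              # ruined: stay at 0 forever
P[N, N] = 1.0              # won everything: stay at N forever
for i in range(1, N):
    P[i, i - 1] = p_lose   # lose $1
    P[i, i + 1] = p_win    # win $1

# Each row is a probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)
```

Checking that every row sums to $1$ is a quick way to confirm the matrix is read row-by-row: row $i$ lists the probabilities of all possible next states, given the current state $i$.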

I don't understand why $p(0,0)=1=p(N,N)$. Moreover, I don't see how to read this matrix. For example, is $p(3,4)=p\{X_4=4\mid X_3=3\}$?

And is $p(0,0)=p\{X_0=0\mid X_0=0\}$ and $p(5,5)=p\{X_5=5\mid X_5=5\}$? My interpretation looks strange...


EDIT

Since I don't have any answer from Tlön Uqbar Orbis Tertius, let's look at this example: let $X_n$ be the weather on day $n$. The weather isn't exactly a Markov chain, but we'll suppose that it is. Let $1=\text{Rainy}$ and $2=\text{Sunny}$. How would you interpret $$\begin{array}{ccc}&\boldsymbol 1&\boldsymbol 2\\ \boldsymbol 1&.6&.4\\\boldsymbol 2&.2&.8\end{array} \ \ \ \ ?$$ Does $p(1,2)=p\{X_2=2\mid X_1=1\}$? If yes, how do we compute $p(n,n+1)$ if $n>2$? Because here we only have the results for $n\in\{1,2\}$.

There are 3 answers below.

Best answer:

Let's look at your weather example:

Let $X_n$ be the weather on day $n$. The weather isn't exactly a Markov chain, but we'll suppose that it is. Let $1=\text{Rainy}$ and $2=\text{Sunny}$. How would you interpret $$\begin{array}{ccc}&\boldsymbol 1&\boldsymbol 2\\ \boldsymbol 1&.6&.4\\\boldsymbol 2&.2&.8\end{array} \ \ \ \ ?$$ Does $p(1,2)=p\{X_2=2\mid X_1=1\}$? If yes, how do we compute $p(n,n+1)$ if $n>2$? Because here we only have the results for $n\in\{1,2\}$.

My Explanation:

This means that if you are in state 1 ($\text{Rainy}$) at time $n$, there is a $0.6$ probability that you will be in state 1 ($\text{Rainy}$) at time $n + 1$ and a $0.4$ probability that you will be in state 2 ($\text{Sunny}$) at time $n + 1$.

A Discrete Time Markov Chain describes the evolution of a system (defined by a state space $S = \{s_1, s_2, ..., s_N\}$) over discrete time periods $0, 1, 2, ..., T$.

$X_n$ is the state that your system is in (in your example it is either 1 ($Rainy$) or 2 ($Sunny$)) at time $n$.
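To illustrate (a sketch with my own variable names, not from the answer): taking one step of the chain just means sampling the next state from the row of the current state, and the same matrix is used at every time $n$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weather chain: index 0 = state 1 (Rainy), index 1 = state 2 (Sunny).
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

state = 0                  # start on a rainy day
path = [state]
for _ in range(10):
    # X_{n+1} is drawn from row X_n of the matrix, regardless of n.
    state = rng.choice(2, p=P[state])
    path.append(state)
```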

Helpful Information Related to Your Question:

$p\{X_2=1\mid X_1=1\}$ = $p\{X_3=1\mid X_2=1\}$ = $p\{X_4=1\mid X_3=1\}$ = P(weather goes from $Rainy$ to $Rainy$ in 1 step) = 0.6

$p\{X_2=2\mid X_1=1\}$ = $p\{X_3=2\mid X_2=1\}$ = $p\{X_4=2\mid X_3=1\}$ = P(weather goes from $Rainy$ to $Sunny$ in 1 step) = 0.4

$p\{X_2=1\mid X_1=2\}$ = $p\{X_3=1\mid X_2=2\}$ = $p\{X_4=1\mid X_3=2\}$ = P(weather goes from $Sunny$ to $Rainy$ in 1 step) = 0.2

$p\{X_2=2\mid X_1=2\}$ = $p\{X_3=2\mid X_2=2\}$ = $p\{X_4=2\mid X_3=2\}$ = P(weather goes from $Sunny$ to $Sunny$ in 1 step) = 0.8

The above is defined by the matrix you provided. Hope this helps!
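The four equalities above can be checked numerically, and they also answer the question about $n>2$: because the chain is time-homogeneous, $p\{X_{n+1}=j\mid X_n=i\}$ is the same for every $n$, and $m$-step probabilities come from the matrix power $P^m$. A numpy sketch (variable names are mine):

```python
import numpy as np

# Weather chain: index 0 = state 1 (Rainy), index 1 = state 2 (Sunny).
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# The same one-step matrix applies at every time n.
# Two-step probabilities, e.g. P(X_{n+2} = Sunny | X_n = Rainy):
P2 = P @ P
# P2[0, 1] = 0.6 * 0.4 + 0.4 * 0.8 = 0.56
assert np.isclose(P2[0, 1], 0.56)
```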

Another answer:

When you're ruined, you're definitely ruined. That's what $p(0,0) = 1$ says: if your personal wealth is zero, you can't gamble anymore, so you stay in state $0$. Now let $N$ be the fortune of the casino. If you have won $N$, then the casino is ruined, and you can't gamble anymore either: that's the meaning of $p(N,N) = 1$. You've won everything, so the game is over, and you stay in state $N$. These two states are called "absorbing", because once you reach them, you can't get out.
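The absorbing behaviour can be seen numerically: raising the transition matrix to a large power pushes all of the probability mass into states $0$ and $N$. A numpy sketch (it rebuilds the gambler's matrix from the question; the comparison with the classic gambler's-ruin formula is my addition):

```python
import numpy as np

N, p_win, p_lose = 5, 0.4, 0.6
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0            # absorbing states
for i in range(1, N):
    P[i, i - 1], P[i, i + 1] = p_lose, p_win

# After many steps the chain has (almost surely) been absorbed at 0 or N,
# so every intermediate column of P^n tends to 0.
Pn = np.linalg.matrix_power(P, 200)
assert np.allclose(Pn[:, 1:N], 0.0, atol=1e-9)

# First column = ruin probabilities; with r = q/p they match the
# classic formula P(ruin | start at i) = (r^N - r^i) / (r^N - 1).
r = p_lose / p_win
ruin = np.array([(r**N - r**i) / (r**N - 1) for i in range(N + 1)])
assert np.allclose(Pn[:, 0], ruin, atol=1e-9)
```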

And your interpretation of $p(3,4)$ is the right one : in general, $p(x,y)$ is the probability to go at the state $y$, knowing you're now at the state $x$.

Another answer:

It depends on whether we assume row vectors or column vectors. In the matrix above, each row sums to $1$, which corresponds to the row-vector convention: entry $(i,j)$ is the probability of going from state $i$ today to state $j$ tomorrow, and a row vector of today's probabilities multiplies the matrix from the left, as in $\mathbf{p}^T\mathbf{A}$. With the question's labeling ($1=\text{Rainy}$, $2=\text{Sunny}$), $0.6$ is the probability that tomorrow is rainy given that today is rainy, $0.4$ that tomorrow is sunny given that today is rainy, $0.2$ that tomorrow is rainy given that today is sunny, and $0.8$ that tomorrow is sunny given that today is sunny.

If instead we use a column vector of today's probabilities and multiply by the matrix from the left, the matrix must be transposed to carry the same meaning, because of how matrix multiplication behaves under transposition: $(\mathbf{p}^T\mathbf{A})^T = \mathbf{A}^T\mathbf{p}$.
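A quick numerical check of the two conventions with numpy (a sketch; the state order follows the question's labeling, Rainy first):

```python
import numpy as np

# Row-stochastic weather matrix: entry (i, j) = P(tomorrow = j | today = i).
A = np.array([[0.6, 0.4],
              [0.2, 0.8]])

today = np.array([1.0, 0.0])   # today is certainly Rainy

tomorrow_row = today @ A       # row-vector convention: p^T A
tomorrow_col = A.T @ today     # column-vector convention needs A transposed

# Both conventions give the same distribution for tomorrow's weather.
assert np.allclose(tomorrow_row, tomorrow_col)
assert np.allclose(tomorrow_row, [0.6, 0.4])
```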