A Markov chain problem in Ross's Introduction to Probability Models (11th edition)


In Section 4.5.3 of Ross's "Introduction to Probability Models" (11th edition), there is a Markov chain with the following transition probabilities:

"Consider a Markov chain with states $0, 1, \dots, n$ having $$P_{0,1}=1, \qquad P_{i,i+1}=p, \quad P_{i,i-1}=q=1-p, \qquad 1\le i<n$$

and for this Markov chain, let $N_i$ denote the number of transitions that it takes the chain, from when it first enters state $i$, until it enters state $i+1$." The book then says that, by the Markovian property, the random variables $N_i$, $i = 0,\dots,n-1$, are all independent.

Here is my first question: how can we conclude that these random variables are independent from the Markovian property, since the Markovian property only tells us that future events are independent of past events given the present state?

The book then lets $\mu_i = E[N_i]$ and obtains the following equation by conditioning on the next transition after the chain enters state $i$, $i = 1,\dots,n-1$:

$$\mu_i=1+E[\text{number of additional transitions to reach } i+1 \mid \text{chain to } i-1]\,q$$

My second question is why the first term here is $1$ rather than $p$. According to the law of total expectation, the equation should be $$\mu_i = E[N_i] = E[N_i\mid\text{chain to}\ i+1]\,p+E[N_i\mid\text{chain to}\ i-1]\,q$$ and since $E[N_i\mid\text{chain to}\ i+1] = 1$, the first term should be $p$.

There appears to be a misunderstanding about what the Markov property precisely says, so let's clarify. The Markov property, also known as the Markov assumption or memorylessness property, tells us that the future of a stochastic process depends only on its current state and is independent of its past states. In other words, the system has no memory of its past; only the present state influences what happens next.

Based on this definition, the dependence between states is confined to a single step, exactly as you described it.

In the given Markov chain, $N_i$ represents the number of transitions it takes for the chain to move from state $i$ to state $i+1$. Since the Markov property states that the future state of a Markov chain depends only on its current state and is independent of its past states, the process of moving from state $i$ to state $i+1$ is independent of the process of moving from state $i-1$ to state $i$ or any other previous transitions.

In other words, the number of transitions it takes to go from one state to the next is determined only by the current state and does not depend on the history of previous transitions. More precisely, each $N_i$ begins at the first hitting time of state $i$, which is a stopping time, so the (strong) Markov property guarantees that the excursion from $i$ to $i+1$ is independent of everything the chain did before reaching $i$. As a result, the random variables $N_i$ are independent of each other.
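As a quick sanity check (not from the book), here is a short Python sketch that simulates $N_i$ for this chain and compares the sample mean against the recursion $\mu_0 = 1$, $\mu_i = (1 + q\mu_{i-1})/p$, which follows from the book's equation since the conditional expectation of the additional transitions equals $\mu_{i-1}+\mu_i$. The function names and the values of $p$, $i$, and the trial count are my own choices for illustration.

```python
import random

def sample_N(i, p):
    """Simulate N_i: number of transitions from first entering state i
    until first entering state i+1, for the chain with P_{0,1}=1,
    P_{j,j+1}=p and P_{j,j-1}=q=1-p for 1 <= j < n."""
    steps = 0
    state = i
    while state != i + 1:
        if state == 0:
            state = 1              # P_{0,1} = 1: forced step up from 0
        elif random.random() < p:
            state += 1             # step up with probability p
        else:
            state -= 1             # step down with probability q
        steps += 1
    return steps

def mu_exact(i, p):
    """mu_i from the recursion mu_0 = 1, mu_i = (1 + q*mu_{i-1}) / p."""
    q = 1 - p
    mu = 1.0
    for _ in range(i):
        mu = (1 + q * mu) / p
    return mu

random.seed(0)
p, i, trials = 0.6, 3, 100_000
est = sum(sample_N(i, p) for _ in range(trials)) / trials
print(f"simulated mean of N_{i}: {est:.3f}, recursion gives {mu_exact(i, p):.3f}")
```

For $p=0.6$ and $i=3$ the two numbers agree to within Monte Carlo error, which supports both the independence argument and the conditioning equation below.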

To your second question: the two formulas agree. The book's leading $1$ collects the first transition from both conditionings at once; since $p + q = 1$ (normalization), the first step contributes $p\cdot 1 + q\cdot 1 = 1$, and only the *additional* transitions after a step down to $i-1$ remain inside the conditional expectation.
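Writing out the algebra (a sketch of the step, using your own conditioning) makes this explicit:

$$\begin{aligned}
\mu_i &= E[N_i\mid\text{chain to}\ i+1]\,p + E[N_i\mid\text{chain to}\ i-1]\,q \\
&= 1\cdot p + \bigl(1 + E[\text{additional transitions to reach } i+1 \mid \text{chain to } i-1]\bigr)\,q \\
&= (p + q) + E[\text{additional transitions to reach } i+1 \mid \text{chain to } i-1]\,q \\
&= 1 + E[\text{additional transitions to reach } i+1 \mid \text{chain to } i-1]\,q,
\end{aligned}$$

which is exactly the book's equation. Your formula is correct too; you only need to expand $E[N_i\mid\text{chain to}\ i-1]$ as $1$ plus the additional transitions.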