Constructing a $4$-state Markov chain model that describes the arrival of customers


The times between successive customer arrivals at a facility are independent and identically distributed random variables with the following PMF:

$$p(k) = \begin{cases} 0.2, & k = 1\\ 0.3, & k = 3\\ 0.5, & k = 4\\ 0, & k \notin \{1,3,4\}. \end{cases}$$

Construct a four-state Markov chain model that describes the arrival process. In this model, one of the states should correspond to the times when an arrival occurs.


Can you please explain in simple words how to construct this Markov chain? I am totally lost on how the given distribution relates to the problem and how I can use it.


The state of the Markov chain the question seems to be asking about is the length of time since the last customer arrived. It can take any of the four values $\ 0\ $, $\ 1\ $, $\ 2\ $, or $\ 3\ $. The chain is in state $\ 0\ $ at time $\ n\ $ if a customer arrived at that time, and is otherwise in state $\ i\ne0\ $ at that time if the last customer arrived at time $\ n-i\ $.

If the state at time $\ n\ $ is $\ 0\ $, then there is a probability of $\ 0.2\ $ that the next customer will arrive at time $\ n+1\ $, in which case the chain will remain in state $\ 0\ $, and a probability of $\ 0.8\ $ that the next customer will arrive at either time $\ n+3\ $ or $\ n+4\ $, in which case the state at time $\ n+1\ $ will be $\ 1\ $. Thus we have \begin{align} p_{00}&=0.2\\ p_{01}&=0.8 \end{align}
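As a quick sanity check of this step (a Python sketch, not part of the original answer), the two probabilities from state $0$ can be read straight off the PMF: an arrival at the very next step with probability $p(1)$, and anything longer otherwise.

```python
# PMF of the inter-arrival times, copied from the problem statement.
pmf = {1: 0.2, 3: 0.3, 4: 0.5}

# From state 0 (an arrival just happened) the next inter-arrival time K is drawn fresh:
p00 = pmf.get(1, 0.0)                           # K = 1: stay in state 0
p01 = sum(p for k, p in pmf.items() if k > 1)   # K > 1: move to state 1

print(p00, p01)
```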

If the chain is in state $\ 1\ $ at time $\ n\ $, then the last customer arrived at time $\ n-1\ $ and none arrived at time $\ n\ $, so the next one won't arrive until either time $\ n+2\ $ or $\ n+3\ $, and the state at time $\ n+1\ $ will be $\ 2\ $ with probability $\ 1\ $. That is, $$ p_{12}=1\ . $$ If the chain is in state $\ 2\ $ at time $\ n\ $, then the last customer arrived at time $\ n-2\ $ and none arrived at time $\ n\ $. There is then a (conditional) probability of $\ \frac{0.3}{0.8}\ $ that the next customer will arrive at time $\ n+1\ $, in which case the state at time $\ n+1\ $ will be $\ 0\ $, and a probability of $\ \frac{0.5}{0.8}\ $ that the next customer will arrive at time $\ n+2\ $, in which case the state at time $\ n+1\ $ will be $\ 3\ $. Thus we have \begin{align} p_{20}&=\frac{0.3}{0.8}\\ p_{23}&=\frac{0.5}{0.8}\ . \end{align} Here, we must condition our probabilities on the event that the time between the arrival of the last customer and the next was not $\ 1\ $ time unit, because we know that that event has occurred.

If the chain is in state $\ 3\ $ at time $\ n\ $, then the last customer arrived at time $\ n-3\ $ and none arrived at time $\ n\ $, so the next one must arrive at time $\ n+1\ $, and the chain will then be in state $\ 0\ $. Thus we have $$ p_{30}=1\ . $$ Putting all this together, we get the following transition matrix for the Markov chain. $$ P=\pmatrix{0.2&0.8&0&0\\ 0&0&1&0\\ \frac{0.3}{0.8}&0&0&\frac{0.5}{0.8}\\ 1&0&0&0}\ . $$
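One way to check that this matrix really encodes the arrival process (a simulation sketch of my own, not from the original answer) is to run the chain and record the gaps between successive visits to state $0$; their empirical frequencies should approach the given PMF.

```python
import random
from collections import Counter

random.seed(0)

# Transition matrix derived above.
P = [
    [0.2,     0.8, 0.0, 0.0],
    [0.0,     0.0, 1.0, 0.0],
    [0.3/0.8, 0.0, 0.0, 0.5/0.8],
    [1.0,     0.0, 0.0, 0.0],
]

def step(state):
    """Draw the next state according to row `state` of P."""
    return random.choices(range(4), weights=P[state])[0]

# Start just after an arrival; record gaps between visits to state 0.
state, last_arrival, gaps = 0, 0, []
for t in range(1, 200_000):
    state = step(state)
    if state == 0:
        gaps.append(t - last_arrival)
        last_arrival = t

freq = Counter(gaps)
for k in (1, 3, 4):
    print(k, round(freq[k] / len(gaps), 3))  # should be close to 0.2, 0.3, 0.5
```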