Markov Chain Construction



For part (a), I'm a bit confused about how the long-term behaviour affects the construction. Would I just draw a state diagram with the transition probabilities stemming from $A$? Or do I have to find the fixed-point (stationary) probabilities?

For part (b), I'm not sure what the question is actually asking; could someone explain the notation to me?

Thanks in advance!


For part (a), what the question stem gives is the asymptotic behaviour of the Markov chain. As I understand it, the notation $p_{\infty}(A \to A)$ denotes the limiting probability of being at $A$ on the $n$th step, given that we started at $A$. Note that a Markov chain with transition matrix $P \in \mathbb{R}^{4\times 4}$ and asymptotic distribution $\pi \in \mathbb{R}^4$ (assuming it exists) satisfies

$$P^T \pi = \pi $$

In our case the asymptotic distribution is given by

$$\pi = \left[\begin{matrix}0.5 \\ 0.2 \\ 0.2 \\ 0.1 \end{matrix} \right] $$

Now, our task is to find a transition matrix $P$; from it, we can construct the Markov chain. Of course, there is no unique solution, so we need to make some choices. Perhaps the easiest thing to do first is to decide on the structure of the MC. Note that the asymptotic distribution is nonzero in every entry, which suggests making the MC irreducible (so we can reach any vertex from any other vertex).

One way that comes to mind is simply making a cycle graph with the following transition matrix,

$$P = \left[ \begin{matrix}a & 1-a & 0 & 0 \\ 0 & b & 1-b & 0 \\ 0 & 0 & c & 1-c \\ 1-d & 0 & 0 & d \end{matrix} \right] $$

for some parameters $a, b, c, d \in [0, 1]$. Put this into the stationary equation with the given $\pi$ (don't forget, you use $P^T$ and not $P$!) and solve the resulting linear system. I will let you do the details, but the four equations (one of which is redundant, since both sides sum to $1$) reduce to

$$c = b, \qquad d = 2b - 1, \qquad a = 0.6 + 0.4b,$$

leaving one free parameter (note that $b \ge 0.5$ is needed so that $d \ge 0$).
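If you'd rather let a computer do the algebra, here is a sketch using SymPy (assuming it is available) that solves the stationarity condition $P^T \pi = \pi$ symbolically for $a$, $c$, $d$ in terms of $b$:

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')

# Cycle-graph transition matrix from above
P = sp.Matrix([
    [a,     1 - a, 0,     0    ],
    [0,     b,     1 - b, 0    ],
    [0,     0,     c,     1 - c],
    [1 - d, 0,     0,     d    ],
])

# The given asymptotic distribution (0.5, 0.2, 0.2, 0.1) as exact rationals
pi = sp.Matrix([sp.Rational(1, 2), sp.Rational(1, 5),
                sp.Rational(1, 5), sp.Rational(1, 10)])

# Stationarity: P^T pi = pi, solved for a, c, d with b left free
sol = sp.solve(list(P.T * pi - pi), [a, c, d], dict=True)
print(sol)  # e.g. [{a: 2*b/5 + 3/5, c: b, d: 2*b - 1}]
```

This confirms the one-parameter family of solutions: any admissible choice of $b$ determines the other three entries.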

I chose $b = c = 0.6$, which forces $d = 0.2$ and $a = 0.84$. You should work through the equations yourself and find your own solution. An interesting extension question is to ask what kinds of stationary distributions can be realized on a cycle graph, or more generally how the structure of the graph relates to the stationary distribution.
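As a numerical sanity check (a sketch using NumPy, with the parameter values chosen above), you can verify that $\pi$ is stationary for the resulting $P$, and that powers of $P$ converge to it:

```python
import numpy as np

# Cycle-graph transition matrix with a = 0.84, b = c = 0.6, d = 0.2
a, b, c, d = 0.84, 0.6, 0.6, 0.2
P = np.array([
    [a,     1 - a, 0,     0    ],
    [0,     b,     1 - b, 0    ],
    [0,     0,     c,     1 - c],
    [1 - d, 0,     0,     d    ],
])

pi = np.array([0.5, 0.2, 0.2, 0.1])

# Stationarity: pi P = pi (equivalently P^T pi = pi)
print(np.allclose(pi @ P, pi))  # True

# The rows of P^n should approach pi as n grows
print(np.linalg.matrix_power(P, 200)[0])
```

The self-loops make the chain aperiodic as well as irreducible, which is why the matrix powers converge to $\pi$ from every starting state.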

For part (b), what this is asking is to add another vertex to your graph such that, starting at $E$, the chain eventually leaves $E$ and never returns. You can accomplish this by adding the vertex $E$ with an edge to one or more of $A, B, C, D$ but no return edge; that way, whenever the MC leaves $E$ it can never come back. To ensure that with positive probability it can remain at $E$ for a while, give $E$ a self-loop with probability $e < 1$. Try to convince yourself intuitively that this construction gives what you need. The transition matrix could be

$$P = \left[ \begin{matrix}a & 1-a & 0 & 0 & 0 \\ 0 & b & 1-b & 0 & 0 \\ 0 & 0 & c & 1-c& 0 \\ 1-d & 0 & 0 & d & 0 \\ 1-e & 0 & 0 & 0 & e \end{matrix} \right] $$

For example, starting at $E$, the probability of being at $E$ after one step is $e$, and the probability of being at $E$ after two steps is $p_2(E \to E) = e^2$ (the chain must take the self-loop both times, since there is no way back). In general $p_n(E \to E) = e^n$, so with $e < 1$ we get $p_\infty(E \to E) = \lim_{n \to \infty} e^n = 0$.
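To see this concretely, here is a short NumPy sketch (using the part (a) parameters plus an arbitrarily assumed self-loop probability $e = 0.5$) that reads $p_n(E \to E)$ off the powers of the $5 \times 5$ matrix:

```python
import numpy as np

# 5x5 matrix from part (b): states A, B, C, D, E (E is index 4).
# a, b, c, d as in part (a); e = 0.5 is an arbitrary choice with e < 1.
a, b, c, d, e = 0.84, 0.6, 0.6, 0.2, 0.5
P = np.array([
    [a,     1 - a, 0,     0,     0],
    [0,     b,     1 - b, 0,     0],
    [0,     0,     c,     1 - c, 0],
    [1 - d, 0,     0,     d,     0],
    [1 - e, 0,     0,     0,     e],
])

# p_n(E -> E) is the (E, E) entry of P^n; it should equal e**n
for n in [1, 2, 5, 10]:
    pn = np.linalg.matrix_power(P, n)[4, 4]
    print(n, pn, e ** n)  # the two values agree, and both tend to 0
```

The column of $P$ for state $E$ is zero except for the self-loop, so the only paths from $E$ back to $E$ are the ones that never leave, which is exactly why $p_n(E \to E) = e^n$.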