Expectation of staying in same state for a simple MC


Consider a simple discrete-time Markov chain $X_t$ with finite state space $\Omega = \{1,2,3\}$. At time 0 the chain is in state 1 with probability 1, i.e. $\mathbb{P}(X_0 = 1) = 1$. The transition probability matrix is $$P = \begin{pmatrix} p_{11} & p_{12} & 0 \\ 0 & p_{22} & p_{23}\\ 0 & 0 & 1 \end{pmatrix}.$$ I would simply like to know the expected time the chain stays in each state. For state 1, is it simply $\mathbb{E}(T_1) = \sum_{k > 0} k\, p_{11}^k$? Thank you!

Best answer:

Let $T_1 = \max\{i : X_i = 1\}$, the last time the chain is in state 1. Since state 1 can never be re-entered once left, $T_1$ is the number of steps the chain spends in state 1 after time 0.

$$P(T_1 = k) = p_{11}^{k}(1-p_{11}), \qquad k \ge 0,$$ $$E(T_1) = \sum_{k=1}^{+\infty} k\, P(T_1 = k) = \sum_{k=1}^{+\infty} k\, p_{11}^{k}(1-p_{11}).$$

You've forgotten the factor $1-p_{11}$: at each step the chain leaves state 1 with probability $1-p_{11}$, so $T_1$ is geometrically distributed on $\{0,1,2,\dots\}$. Of course the result can be simplified if you want: summing the series gives $E(T_1) = \frac{p_{11}}{1-p_{11}}$.
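The geometric answer above can be sanity-checked by simulation. Below is a minimal sketch (the function names `time_in_state_1` and `estimate_mean` are my own, not from the original post): it repeatedly simulates how long the chain remains in state 1 and compares the empirical mean to the closed form $p_{11}/(1-p_{11})$.

```python
import random

def time_in_state_1(p11, rng):
    # Number of steps after time 0 that the chain remains in state 1:
    # at each step it stays with probability p11, otherwise it moves on.
    t = 0
    while rng.random() < p11:
        t += 1
    return t

def estimate_mean(p11, trials=200_000, seed=0):
    # Monte Carlo estimate of E(T_1) over many independent runs.
    rng = random.Random(seed)
    return sum(time_in_state_1(p11, rng) for _ in range(trials)) / trials

p11 = 0.7
closed_form = p11 / (1 - p11)  # mean of a geometric variable on {0, 1, 2, ...}
print(f"simulated: {estimate_mean(p11):.3f}, closed form: {closed_form:.3f}")
```

The same loop works for state 2 with $p_{22}$ in place of $p_{11}$; state 3 is absorbing, so the chain stays there forever.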