Finding the transition probability matrix


$$P(\xi_i = k) = \frac{1}{m} \quad \text{for } k = 1, 2, \ldots, m.$$

Explain why $(X_n)_{n \geq 0}$ is a Markov chain.

Write down the state space and the transition probability matrix of $(X_n)_{n \geq 0}$.

Best answer:

As discussed in the comments, the states of the Markov chain are the possible remaining lifetimes of the battery in hours, namely, $1, 2, \ldots, m$.

As usual, each entry $T_{ij}$ of the transition matrix is the probability that a system in state $j$ will transition during a given iteration into state $i$. (If you use the reverse convention, in which state vectors are row vectors rather than column vectors, just take the transpose of all of the matrices below.)

  • If at some time the battery has $k > 1$ hours of life left, then the behavior is deterministic: the battery is not replaced, so when the system iterates, the battery will have $k - 1$ hours left. Thus, for $k > 1$, $T_{k - 1, k} = 1$ and $T_{ik} = 0$ for $i \neq k - 1$.
  • On the other hand, if the remaining battery life is $1$ hour, then when the system iterates, the battery is replaced with a new battery of uniformly distributed lifetime, i.e., for each $k = 1, \ldots, m$ the new battery's life has probability $\frac{1}{m}$ of being $k$ hours. Thus $T_{k1} = \frac{1}{m}$ for all $k$ (the first column is constant). Putting this all together gives $$\color{#df0000}{\boxed{T = \begin{pmatrix} \frac{1}{m} & 1 & & \\ \frac{1}{m} & & \ddots & \\ \vdots & & & 1 \\ \frac{1}{m} & & & \end{pmatrix}}}.$$
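As a quick sanity check, the matrix above can be built and verified programmatically. The sketch below (the helper name `battery_transition_matrix` is my own, not from the problem) uses exact `Fraction` arithmetic so the column-stochastic property checks exactly:

```python
from fractions import Fraction

def battery_transition_matrix(m):
    """Build the m x m matrix T with T[i][j] = probability of moving
    from state j+1 to state i+1, where state s means s hours of
    remaining battery life (column-stochastic convention)."""
    T = [[Fraction(0) for _ in range(m)] for _ in range(m)]
    # From state 1: the battery is replaced; new lifetime is uniform on 1..m,
    # so the first column is 1/m everywhere.
    for i in range(m):
        T[i][0] = Fraction(1, m)
    # From state k > 1: deterministic transition to state k - 1,
    # so each remaining column has a single 1 on the superdiagonal.
    for k in range(1, m):
        T[k - 1][k] = Fraction(1)
    return T

T = battery_transition_matrix(4)
# Every column sums to 1, as required of a transition matrix.
assert all(sum(T[i][j] for i in range(4)) == 1 for j in range(4))
```

For $m = 3$ this produces rows $(\frac13, 1, 0)$, $(\frac13, 0, 1)$, $(\frac13, 0, 0)$, matching the boxed matrix.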