Probability of a particular event on a Markov chain


Here is something I don't understand, and I would appreciate very much if you could help me out clarifying it.

Take $Z_1, Z_2, \dots, Z_t$ to be a finite-state, time-homogeneous ergodic Markov chain. I will denote by $\tau = \tau(Z_1, Z_2, \dots, Z_t) = \inf\{s \geq 1 : Z_s = k\}$ the first time the chain hits the state labelled $k$. Can I really define this random variable? Isn't it a problem that it may be undefined if the chain never visits $k$?

We will also define the indicator $E = E(Z_1, Z_2, \dots, Z_t) = 1\{\exists s \in [t-1] : Z_s = k\}$ of the event that state $k$ is hit within the first $t-1$ steps.

If I can write $$\Pr\left( Z_{\tau + 1} = z \text{ and } E \right)$$ (on $E$ we know that $k$ was hit at least once, so $\tau + 1$ makes sense), then can't I also write

$$\Pr\left( Z_{\tau + 1} = z \text{ and } E \right) = \Pr\left( E \mid Z_{\tau + 1} = z\right) \Pr(Z_{\tau + 1} = z)$$

But isn't there something odd about the second factor? Outside the event $E$, $Z_{\tau+1}$ may not even be defined. What am I breaking here?
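To make the question concrete, here is a minimal simulation sketch (the 3-state transition matrix, the start state, and the parameters $k$, $z$, $t$ are all hypothetical choices, not from the question). It estimates $\Pr(Z_{\tau+1} = z \text{ and } E)$ directly; note that $Z_{\tau+1}$ is only ever evaluated on sample paths where $E$ occurs, so the joint probability is well defined even though $Z_{\tau+1}$ alone is not defined off $E$.

```python
import random

# Hypothetical 3-state ergodic chain: all entries are positive, so the
# chain is irreducible and aperiodic. States are 0, 1, 2.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.3, 0.4]]
k, z, t = 2, 0, 5  # target state k, query state z, horizon t (assumed values)

def run_chain(rng, start=0):
    """Simulate Z_1, ..., Z_t starting from `start` (path[0] is Z_1)."""
    path = [start]
    for _ in range(t - 1):
        path.append(rng.choices(range(3), weights=P[path[-1]])[0])
    return path

rng = random.Random(0)
n, hits = 200_000, 0
for _ in range(n):
    path = run_chain(rng)
    # E: the chain visits k among Z_1, ..., Z_{t-1} (indices 0..t-2 here).
    first = next((s for s in range(t - 1) if path[s] == k), None)
    if first is not None and path[first + 1] == z:
        # On E, tau = first + 1 in 1-based time, and Z_{tau+1} is
        # path[first + 1]; off E we never evaluate Z_{tau+1} at all.
        hits += 1
print(f"Pr(Z_(tau+1) = z and E) is approximately {hits / n:.4f}")
```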

1 Answer

A Markov chain is ergodic if all of its states are ergodic, i.e. each state is aperiodic and has a finite mean recurrence time. Hence every state is visited at some point almost surely, and $\tau = \inf\{ s \geq 1 ~:~ Z_{s} = k\}$ is well defined (with probability one).
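As a sanity check, the claim can be illustrated by simulation on a hypothetical 2-state ergodic chain (the transition probabilities and start state below are illustrative assumptions): every run hits $k$ in finite time, and the empirical mean of $\tau$ is finite.

```python
import random

# Hypothetical 2-state chain with strictly positive rows, hence ergodic.
# From state 0 the chain jumps to state 1 with probability 0.1 per step,
# so the hitting time of k = 1 from state 0 is Geometric(0.1), mean 10.
P = {0: [0.9, 0.1], 1: [0.5, 0.5]}
k = 1

def hitting_time(rng, start=0, cap=100_000):
    """Return tau = inf{s >= 1 : Z_s = k}, or None if the cap is reached."""
    state = start
    for s in range(1, cap + 1):
        state = rng.choices([0, 1], weights=P[state])[0]
        if state == k:
            return s
    return None  # never triggered in practice for an ergodic chain

rng = random.Random(1)
times = [hitting_time(rng) for _ in range(50_000)]
assert None not in times          # tau was finite on every run
print(f"estimated E[tau] = {sum(times) / len(times):.2f}")  # close to 10
```

The finite mean recurrence time is exactly what rules out the pathological case the question worries about: the set $\{s \geq 1 : Z_s = k\}$ is nonempty almost surely, so its infimum is an honest random variable.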

I do not understand the second part of your question, so I cannot help you there.