"Number of passages from the state $i$": a strange equality.


Consider a homogeneous Markov chain $\{X_n\,:\, n\in \mathbb N\}$ (with $0\in\mathbb N$), with state space $S$ satisfying $|S|\le |\mathbb N|$, and fix $i\in S$. Consider moreover the indicator function $1_{\{i\}}:S\longrightarrow\{0,1\}$ such that $1_{\{i\}}(i)=1$ and $1_{\{i\}}(s)=0$ if $s\neq i$, and define the following random variable:

$$N_i:=\sum_{n=1}^\infty1_{\{i\}}(X_n)$$

Clearly $N_i$ counts (starting from $X_1$) the "passages" of the Markov chain through the state $i$.

I don't understand the following equality, which can be found in the book *A Course in Stochastic Processes* by D. Bosq and H. T. Nguyen (page 57 in my edition):

$$E(N_i|X_0=i)=\sum_{n=1}^\infty P(X_n=i|X_0=i)$$

where $E(\cdot\mid\cdot)$ is the conditional expectation. Please give me an explanation.

Thanks in advance.

Best Answer

Note that $1_{\{i\}}(X_n)=\mathbf{1}_{\{X_n=i\}}$, so $N_i=\sum_{n=1}^\infty \mathbf{1}_{\{X_n=i\}}$. Since the summands are nonnegative, the monotone convergence theorem (equivalently, Tonelli's theorem) lets us interchange the expectation and the infinite sum:
$$E(N_i\mid X_0=i)=\sum_{n=1}^\infty E\big(\mathbf{1}_{\{X_n=i\}}\,\big|\,X_0=i\big).$$
The claim then follows from the fact that $$ {\rm E}_P[\mathbf{1}_A]=P(A) $$ for any event $A$ and any probability measure $P$. Use this result with $A=\{X_n=i\}$ and $P$ being the conditional probability measure $P(\cdot\mid X_0=i)$.
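As a numerical sanity check (not from the book), here is a minimal sketch on a hypothetical two-state chain: the left-hand side $E(N_i\mid X_0=i)$ is estimated by Monte Carlo simulation, the right-hand side by summing the $n$-step return probabilities $P^n[i,i]$, both truncated at a finite horizon. The transition matrix, horizon, and sample size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state transition matrix; state 0 plays the role of i.
P = np.array([[0.5, 0.5],
              [0.3, 0.7]])
i, horizon, n_paths = 0, 50, 20000

# Right-hand side: sum_{n=1}^{horizon} P(X_n = i | X_0 = i) = sum of P^n[i, i].
Pn, rhs = np.eye(2), 0.0
for _ in range(horizon):
    Pn = Pn @ P
    rhs += Pn[i, i]

# Left-hand side: Monte Carlo estimate of E(N_i | X_0 = i),
# counting visits to i among X_1, ..., X_horizon.
visits = 0
for _ in range(n_paths):
    x = i
    for _ in range(horizon):
        x = rng.choice(2, p=P[x])
        visits += (x == i)
lhs = visits / n_paths

print(lhs, rhs)  # the two values should agree up to Monte Carlo error
```

Both sides must be truncated because this chain is recurrent, so the full series diverges; the identity still holds term by term, which is what the check exercises.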