Proving an equality for a Markov Chain


Let $X_n$ be an irreducible Markov chain taking values from the natural numbers (including $0$). Let $g,f$ be functions with $\mathbb{N}$ as domain (including $0$) such that $f = g + Pf$, where $P$ is the transition matrix for $X_n$. A note of notation: $Pf(i) = \sum_j p_{i,j} f(j)$. Prove that $$ f(i) = E_i[f(X_n)] + \sum_{k = 0}^{n-1} E_i[g(X_k)], \ \ \forall n \in \mathbb{N} $$

First, what is meant by $E_i[f(X_n)]$? Is it the expectation of the function $f(X_n)$ given that $X_0 = i$?

If that is correct, then I can write $f(X_n) = g(X_n) + Pf(X_n)$ by hypothesis. Is it then correct that $Pf(X_n) = \sum_j p_{X_n,j} \ f(j)$?
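To make sure I'm reading the notation right: $Pf$ is itself a function on the state space, and $Pf(X_n)$ just means that function evaluated at the (random) state $X_n$. Here is a small numerical sketch of that reading, using a hypothetical 3-state chain (the matrix and $f$ are made up for illustration; the actual problem has a countable state space):

```python
import numpy as np

# Hypothetical 3-state transition matrix and function f (stand-ins
# for the countable-state objects in the problem).
P = np.array([[0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4],
              [0.6, 0.1, 0.3]])
f = np.array([2.0, 0.0, -1.0])

# Pf is a new function on states: (Pf)(i) = sum_j p_{i,j} f(j),
# which is just the matrix-vector product P @ f.
Pf = P @ f

# Evaluating Pf at a particular state i matches the explicit sum,
# so Pf(X_n) is this same function evaluated at the random state X_n.
i = 1
explicit = sum(P[i, j] * f[j] for j in range(3))
print(np.isclose(Pf[i], explicit))  # True
```

So yes, under this reading $Pf(X_n) = \sum_j p_{X_n,j} f(j)$, with the row index being whatever value $X_n$ happens to take.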

I guess I could prove this by using the linearity of expectation, but the notation is really confusing me.
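As a sanity check before attempting the proof, the identity can be verified numerically on a small finite chain. A sketch, assuming a made-up 3-state stochastic matrix: pick any $f$, define $g := f - Pf$ so the hypothesis $f = g + Pf$ holds by construction, and compare both sides using $E_i[f(X_n)] = (P^n f)(i)$ and $E_i[g(X_k)] = (P^k g)(i)$:

```python
import numpy as np

# Hypothetical 3-state chain; any stochastic matrix works for this check.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2],
              [0.5, 0.25, 0.25]])

f = np.array([1.0, -2.0, 5.0])   # arbitrary choice of f
g = f - P @ f                    # define g so that f = g + Pf holds

n = 7
# Right-hand side: E_i[f(X_n)] + sum_{k=0}^{n-1} E_i[g(X_k)],
# i.e. (P^n f)(i) + sum_k (P^k g)(i), for every start state i at once.
rhs = np.linalg.matrix_power(P, n) @ f
rhs = rhs + sum(np.linalg.matrix_power(P, k) @ g for k in range(n))

print(np.allclose(rhs, f))  # True: the identity holds in every state
```

The check also hints at the proof: substituting $g = f - Pf$ makes the sum telescope, $\sum_{k=0}^{n-1} P^k(f - Pf) = f - P^n f$, which is exactly the claimed identity written in matrix form.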