Markov-chain properties


I have some questions about a Markov chain $(X_n)$ on a finite state space $S$ with transition matrix $P$. A function $f:S\rightarrow\mathbb R$ is a column vector, and $Pf$ is therefore a matrix multiplication. There are three points I do not understand:

  • $\mathbb E^x$ means that $X_0=x\in S$. Then it holds that $[Pf](x)=\mathbb E^xf(X_1)$

    I honestly do not see where this comes from.

  • Consider an arbitrary function $g:S\rightarrow\mathbb R$. Then $g(X_n)$ is $\sigma(X_n)$-measurable and $g(X_n)$ is $\mathcal F_n$-measurable where $\mathcal F_n=\sigma(X_k:0\le k\le n)$

  • For fixed $f:S\rightarrow \mathbb R$ the process $M_n=f(X_n)-f(X_0)-\sum_{k=0}^{n-1}[(P-I)f](X_k)$ is a martingale with respect to the filtration $\mathcal F_n$

I do not see why this should be true; all I know is that if $Pf\le f$, then $(f(X_n))$ is a supermartingale.
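For concreteness, the third claim can be tested numerically. The sketch below uses a made-up 3-state chain (the matrix $P$ and the function $f$ are arbitrary, not from the question) and checks that the sample mean of $M_n$ stays near $M_0=0$, as it must for a martingale started at $0$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-state chain; P and f are arbitrary toy values.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
f = np.array([1.0, -2.0, 5.0])
drift = (P - np.eye(3)) @ f       # the vector [(P - I)f] over states

n_paths, n_steps = 100_000, 10
X = np.zeros((n_paths, n_steps + 1), dtype=int)   # all paths start at X_0 = 0
for t in range(n_steps):
    # Inverse-CDF sampling of the next state for every path at once.
    u = rng.random(n_paths)
    cum = np.cumsum(P[X[:, t]], axis=1)
    X[:, t + 1] = (u[:, None] > cum).sum(axis=1)

# M_n = f(X_n) - f(X_0) - sum_{k<n} [(P-I)f](X_k); since M_0 = 0,
# E[M_n] should stay 0 for all n if M is a martingale.
M = f[X[:, -1]] - f[X[:, 0]] - drift[X[:, :-1]].sum(axis=1)
print(M.mean())                   # should be close to 0
```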

Best answer

(1) follows from the definition of matrix multiplication. Write out the matrix $P$ and the column vector $f$ and multiply them: what is the expression for each entry of the result?

For the row corresponding to $x$, it is $[Pf](x)=\sum_{y\in S} P_{xy}f(y)$.

Convince yourself that this is the same as $\mathbb E^xf(X_1)$, since $P_{xy}=\mathbb P(X_1=y\mid X_0=x)$.
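As a numerical sanity check (a minimal sketch; the 3-state matrix $P$ and the function $f$ below are made-up toy values, not from the question), the matrix product $[Pf](x)$ agrees with a one-step Monte Carlo estimate of $\mathbb E^x f(X_1)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain; rows of P sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
f = np.array([1.0, -2.0, 5.0])    # a function f: S -> R as a column vector

Pf = P @ f                        # [Pf](x) = sum_y P[x, y] * f(y)

# Monte Carlo estimate of E^x f(X_1): start at x, take one step, average f(X_1).
x = 0
steps = rng.choice(3, size=200_000, p=P[x])
mc = f[steps].mean()

print(Pf[x], mc)                  # the two numbers should agree closely
```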

(2) Why don't you just check the martingale property the same way you check that a simple symmetric random walk on $\mathbb Z$ is a martingale? Can you see how part (1) is connected to this?
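Concretely, by part (1), $\mathbb E[M_{n+1}-M_n\mid\mathcal F_n]=[Pf](X_n)-f(X_n)-[(P-I)f](X_n)=0$, so the compensated process is a martingale. A sketch of this exact cancellation on a toy chain (the matrix $P$ and $f$ are hypothetical values, not from the question):

```python
import numpy as np

# Hypothetical 3-state chain; any stochastic matrix and any f would do.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
f = np.array([1.0, -2.0, 5.0])
I = np.eye(3)

# The increment is M_{n+1} - M_n = f(X_{n+1}) - f(X_n) - [(P-I)f](X_n).
# Conditioning on X_n = x and using E^x f(X_1) = [Pf](x) from part (1):
#   E[M_{n+1} - M_n | X_n = x] = [Pf](x) - f(x) - [(P-I)f](x)
expected_increment = P @ f - f - (P - I) @ f

print(expected_increment)         # zero for every state x: M is a martingale
```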