In my stochastic processes class, we have seen several definitions of a Markov process. One of them is the following:
$P(X_{t+1} \in A \mid \sigma(X_1, \dots, X_t)) = P(X_{t+1} \in A \mid \sigma(X_t))$ for all measurable $A$.
I have seen it stated that this is equivalent to
$E(f(X_{t+1}) \mid \sigma(X_1, \dots, X_t)) = E(f(X_{t+1}) \mid \sigma(X_t))$ for all bounded, measurable $f$.
I am having trouble seeing why this equivalence is true. Could someone shed some light on this?
Thanks!
The first equation follows from the second by taking $f = I_A$, since $E(I_A(X_{t+1}) \mid \mathcal{G}) = P(X_{t+1} \in A \mid \mathcal{G})$ for any $\sigma$-algebra $\mathcal{G}$.
Conversely, suppose the first equation holds. Then the second equation holds for indicators $f = I_A$ with $A$ measurable. By linearity of conditional expectation, it therefore holds for all simple functions $f$. Every bounded measurable function is a uniform limit of simple functions, and the bounded convergence theorem holds for conditional expectations, so passing to the limit extends the second equation to all bounded measurable $f$.
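The bootstrapping steps above can be written out as a short derivation. This is only a sketch; the notation $\mathcal{F}$ for the conditioning $\sigma$-algebra (either $\sigma(X_1,\dots,X_t)$ or $\sigma(X_t)$) is introduced here for brevity:

```latex
% Step 1 (indicators): for measurable A, the Markov property gives
\[
E\bigl(I_A(X_{t+1}) \mid \sigma(X_1,\dots,X_t)\bigr)
  = P\bigl(X_{t+1}\in A \mid \sigma(X_1,\dots,X_t)\bigr)
  = P\bigl(X_{t+1}\in A \mid \sigma(X_t)\bigr)
  = E\bigl(I_A(X_{t+1}) \mid \sigma(X_t)\bigr).
\]
% Step 2 (simple functions): for f = sum_i c_i I_{A_i}, use linearity
% of conditional expectation:
\[
E\Bigl(\sum_{i=1}^{n} c_i\, I_{A_i}(X_{t+1}) \Bigm| \mathcal{F}\Bigr)
  = \sum_{i=1}^{n} c_i\, E\bigl(I_{A_i}(X_{t+1}) \mid \mathcal{F}\bigr).
\]
% Step 3 (bounded measurable f): choose simple f_n with f_n -> f uniformly
% and |f_n| <= sup|f|; conditional bounded convergence passes the limit
% through the conditional expectation:
\[
E\bigl(f(X_{t+1}) \mid \mathcal{F}\bigr)
  = \lim_{n\to\infty} E\bigl(f_n(X_{t+1}) \mid \mathcal{F}\bigr)
  \quad \text{a.s.}
\]
% Applying Step 3 with both choices of F and using Steps 1-2 for each f_n
% gives the equality for all bounded measurable f.
```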