Consider a probability space $(\Omega, \mathcal{A}, p)$, and let $\{f_t\}_{t\geq 0}$ be a sequence of random variables $f_t: \Omega \longrightarrow E$, for some set $E$. Suppose that $\{f_t\}_{t\geq 0}$ is a Markov chain on $(\Omega, \mathcal{A}, p)$, and consider a function $g:E\longrightarrow S$, for some set $S$. Under which conditions on $g$ is the sequence $\{g\circ f_t\}_{t\geq 0}$ a Markov chain on $(\Omega, \mathcal{A}, p)$?
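To make the question concrete: it is easy to see that $\{g\circ f_t\}$ need not be Markov in general. The following small sketch (my own illustrative example, with $E=\{0,1,2\}$, $S=\{a,b\}$, and a chain that deterministically cycles $0\to 1\to 2\to 0$ from a uniform start) checks this by enumerating trajectories and comparing conditional probabilities of the lumped process:

```python
from fractions import Fraction

# Deterministic cycling chain on E = {0, 1, 2}: 0 -> 1 -> 2 -> 0,
# with a uniform initial distribution. (Purely an illustrative example.)
def step(s):
    return (s + 1) % 3

# g collapses {0, 1} to 'a' and {2} to 'b'.
g = lambda s: 'a' if s < 2 else 'b'

# Enumerate all length-3 trajectories of Y_t = g(f_t) with their probabilities.
paths = []
for s0 in range(3):
    s1, s2 = step(s0), step(s1 := step(s0))
    paths.append(((g(s0), g(s1), g(s2)), Fraction(1, 3)))

def cond(prefix_filter, target):
    """P(Y_2 = target | trajectory prefix satisfies prefix_filter)."""
    num = sum(p for (y, p) in paths if prefix_filter(y) and y[2] == target)
    den = sum(p for (y, p) in paths if prefix_filter(y))
    return num / den if den else None

# Compare P(Y2 = b | Y1 = a, Y0 = a) with P(Y2 = b | Y1 = a, Y0 = b):
# if Y were Markov, conditioning on Y0 could not change the answer.
p_aa = cond(lambda y: y[0] == 'a' and y[1] == 'a', 'b')
p_ba = cond(lambda y: y[0] == 'b' and y[1] == 'a', 'b')
print(p_aa, p_ba)  # 1 0 -- they differ, so {g(f_t)} is not Markov here
```

So the conditional law of $g(f_{t+1})$ given $g(f_t)=a$ depends on the earlier value $g(f_{t-1})$, and the Markov property fails. My question is what condition on $g$ rules this out.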
I do not know much about probability theory, so my question might be obvious to others. I would be interested in some sort of proof, so that I can understand the situation better. References are also welcome.