Discrete-time process vs. Markov chain


Let $X_1,X_2,\dots$ be a discrete-time process taking values in a finite or countable set $S$. I want to prove that there exist a Markov chain $Y_1,Y_2,\dots$ taking values in a countable set $\hat S$, and a function $f:\hat S \rightarrow S$, such that $f(Y_n)=X_n$ for every $n$.

How can one prove this? It seems false at first glance: a general discrete-time process may depend on all of its past values, whereas a Markov chain depends only on its current state.
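One way to see why the apparent contradiction may dissolve is to let the lifted chain's state record the entire history: with $Y_n=(X_1,\dots,X_n)$ on the countable space $\hat S=\bigcup_n S^n$, the process $Y_n$ is trivially Markov, and projecting onto the last coordinate recovers $X_n$. Below is a minimal Python sketch of this lifting; the toy process and all function names (`sample_process`, `lift_to_history_chain`) are my own illustration, not part of the question:

```python
import random

def sample_process(n_steps, rng):
    """Sample a toy non-Markov {0,1}-valued process: each new value
    depends on the whole history (here, the parity of the running sum)."""
    xs = [rng.choice([0, 1])]
    for _ in range(n_steps - 1):
        p = 0.9 if sum(xs) % 2 == 0 else 0.2  # depends on the entire past
        xs.append(1 if rng.random() < p else 0)
    return xs

def lift_to_history_chain(xs):
    """The history lifting: Y_n = (X_1, ..., X_n) lives in the countable
    space of finite tuples over S, and is Markov because Y_{n+1} is
    Y_n with one fresh coordinate appended."""
    return [tuple(xs[:k]) for k in range(1, len(xs) + 1)]

def f(y):
    """Projection onto the last coordinate recovers X_n."""
    return y[-1]

rng = random.Random(0)
xs = sample_process(20, rng)
ys = lift_to_history_chain(xs)
assert all(f(y) == x for x, y in zip(xs, ys))
```

Note this only illustrates the construction; the remaining work in a proof is to check that $(Y_n)$ satisfies the Markov property, which holds because $Y_n$ determines the whole past of the process.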