Let $(e_{t})_{t=0}^{\infty}$ be a sequence of $\mathbb{R}$-valued random variables defined on a common probability space $(\Omega, \Sigma, P)$; assume $(e_{t})_{t=0}^{\infty}$ is a Markov process with kernel $Q$ and initial distribution $\psi$.
We know the process $(e_{t})_{t=0}^{\infty}$ has a recursive representation. That is, there exists a measurable function $G\colon \mathbb{R}\times [0,1]\rightarrow \mathbb{R}$ such that the process defined by
\begin{equation} \tilde{e}_{t+1} = G(\tilde{e}_{t}, \eta_{t}) \end{equation}
has the same distribution (as a process) as $(e_{t})_{t=0}^{\infty}$, where $\tilde{e}_{0}$ has distribution $\psi$ and $(\eta_{t})_{t=0}^{\infty}$ is a sequence of i.i.d. random variables uniformly distributed on $[0,1]$.
My question is, can we define $(\eta_{t})_{t=0}^{\infty}$ on the probability space $(\Omega, \Sigma, P)$? That is, if I start with $(\Omega, \Sigma, P)$, fix it, and have a Markov process $(e_{t})_{t=0}^{\infty}$ on $(\Omega, \Sigma, P)$, can $(e_{t})_{t=0}^{\infty}$ be written recursively on $(\Omega, \Sigma, P)$? Or do we have to construct a new space on which to define $(\eta_{t})_{t=0}^{\infty}$?
My initial solution: we know how to construct $G$. For each $x$, let $F_{x}$ denote the distribution function $y \mapsto Q(x, (-\infty, y])$ and let $F^{-1}_{x}$ be its (generalized) inverse; set $G(x,y) := F^{-1}_{x}(y)$ (see Borovkov, Probability Theory, p. 508). So I thought about defining $\eta_{t}(\omega) := F_{e_{t}(\omega)}(e_{t+1}(\omega))$. Clearly we have $e_{t+1} = G(e_{t}, \eta_{t})$. However, I am at a loss to show that $(\eta_{t})_{t=0}^{\infty}$ is an i.i.d. sequence on $(\Omega, \Sigma, P)$!
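For concreteness, here is a quick numerical sketch of this construction on a toy kernel (the exponential kernel below is my own illustrative choice, not part of the question): simulate the chain via $G(x,\cdot)=F^{-1}_{x}$, define $\eta_t := F_{e_t}(e_{t+1})$, and check empirically that the resulting sequence looks i.i.d. uniform on $[0,1]$.

```python
import math
import random

# Toy kernel (my choice, for illustration only): Q(x, .) = Exponential(rate 1 + |x|),
# so F_x(y) = 1 - exp(-(1+|x|) y) and G(x, u) = F_x^{-1}(u) = -log(1 - u) / (1 + |x|).
def F(x, y):
    return 1.0 - math.exp(-(1.0 + abs(x)) * y)

def G(x, u):
    return -math.log(1.0 - u) / (1.0 + abs(x))

random.seed(0)
n = 100_000
e = 0.0                                  # e_0 = 0 for simplicity
etas = []
for t in range(n):
    e_next = G(e, random.random())       # one step of the chain
    etas.append(F(e, e_next))            # eta_t := F_{e_t}(e_{t+1})
    e = e_next

mean = sum(etas) / n
# lag-1 sample covariance; should be near 0 if the eta_t are independent
lag_cov = sum((etas[t] - mean) * (etas[t + 1] - mean) for t in range(n - 1)) / (n - 1)
print(mean, lag_cov)                     # mean close to 1/2, lag_cov close to 0
```

Of course, passing such empirical checks is consistent with, but does not prove, independence; the answer below explains when the identity $\eta_t = z_t$ actually holds.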
We first rephrase the question, then exhibit a setting where $(\eta_t)$ is indeed independent, and finally show that, in general, $(\eta_t)$ may fail to be independent.
In the post below, "increasing" means strictly increasing.
1. Rephrasing the question. Assume without loss of generality that, for every $t$, $$e_{t+1}=h(e_t,z_t)$$ for some measurable function $h$ and some i.i.d. sequence $(z_t)_{t\geqslant0}$ uniform on $[0,1]$ and independent of $e_0$. Let $\bar z$ denote yet another random variable uniform on $[0,1]$ and independent of everything else. Also assume, again without loss of generality, that for every $x$, the function $h(x,\cdot)$ is nondecreasing on $[0,1]$.
Then, for every $(x,y)$, $F_x(y)=P(e_{t+1}\leqslant y\mid e_t=x)=P(h(x,\bar z)\leqslant y)$ and the question is whether the sequence $\eta_t=F_{e_t}(e_{t+1})$ is independent.
2. In a specific case, the answer is positive. To show that indeed $(\eta_t)$ may be independent, consider the case when, for every $x$, the function $h(x,\cdot)$ is increasing on $[0,1]$.
Then each function $h(x,\cdot)$ is invertible, hence $z_t$ is $\sigma(e_t,e_{t+1})$-measurable since $z_t=h(e_t,\cdot)^{-1}(e_{t+1})$. Thus, $$\eta_t=P(h(e_t,\bar z)\leqslant e_{t+1}\mid e_t,e_{t+1})=P(h(e_t,\bar z)\leqslant h(e_t,z_t)\mid e_t,z_t)=P(\bar z\leqslant z_t\mid e_t,z_t)=z_t$$ which proves that in this case $(\eta_t)$ is indeed independent.
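As a sanity check on the identity $\eta_t=z_t$, here is a small simulation with a strictly increasing $h$ (the logistic kernel below is my own illustrative choice, not from the question): $\eta_t=F_{e_t}(e_{t+1})$ coincides with the driving uniform $z_t$ up to floating-point rounding.

```python
import math
import random

# Illustrative strictly increasing h (my choice): h(x, z) = x/2 + log(z/(1-z)),
# whose conditional CDF is F_x(y) = 1 / (1 + exp(-(y - x/2))), i.e. F_x = h(x, .)^{-1}.
def h(x, z):
    return 0.5 * x + math.log(z / (1.0 - z))

def F(x, y):
    return 1.0 / (1.0 + math.exp(-(y - 0.5 * x)))

random.seed(0)
e = 0.0
max_err = 0.0
for t in range(10_000):
    z = random.random()
    while z == 0.0:              # avoid log(0); the event [z_t = 0] is null anyway
        z = random.random()
    e_next = h(e, z)             # e_{t+1} = h(e_t, z_t)
    eta = F(e, e_next)           # eta_t = F_{e_t}(e_{t+1}), should recover z_t
    max_err = max(max_err, abs(eta - z))
    e = e_next
print(max_err)                   # floating-point rounding only
```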
3. But in the general case... One can only assume that for every $x$, the function $h(x,\cdot)$ is nondecreasing on $[0,1]$, and then things can turn out differently. For a specific example, consider the case when $e_0=0$ and $$h(x,z)=(x+z)\,\mathbf1_{x\leqslant1}$$ Then $F_x(y)=P(x+\bar z\leqslant y)=P(\bar z\leqslant y-x)$ if $x\leqslant1$, and $F_x(y)=\mathbf 1_{y\geqslant0}$ if $x>1$. Thus, $\eta_t=z_t$ if $e_t\leqslant1$ and $\eta_t=1$ if $e_t>1$.
In particular, $[\eta_t=1]=[e_t>1]=[e_{t+1}=0]\subseteq[\eta_{t+1}=z_{t+1}]$ (all equalities holding almost surely), hence $P(\eta_t=\eta_{t+1}=1)\leqslant P(z_{t+1}=1)=0$, while one can check that, for every $t\geqslant2$, $P(\eta_t=1)=P(e_t>1)>0$. Thus, $P(\eta_t=1,\,\eta_{t+1}=1)=0<P(\eta_t=1)\,P(\eta_{t+1}=1)$, and $(\eta_t)$ is not independent.
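The counterexample is easy to simulate. Under $e_0=0$ and $h(x,z)=(x+z)\,\mathbf 1_{x\leqslant1}$, one finds $P(\eta_2=1)=P(z_0+z_1>1)=1/2$ and $P(\eta_3=1)=1/3$, while the joint event never occurs:

```python
import random

# Counterexample from the text: e_0 = 0, h(x, z) = (x + z) * 1{x <= 1},
# so eta_t = z_t on [e_t <= 1] and eta_t = 1 on [e_t > 1].
random.seed(1)
n_paths = 100_000
count2 = count3 = count_both = 0
for _ in range(n_paths):
    e = 0.0
    etas = []
    for t in range(4):
        z = random.random()
        etas.append(z if e <= 1.0 else 1.0)   # eta_t as computed above
        e = (e + z) if e <= 1.0 else 0.0      # e_{t+1} = h(e_t, z_t)
    if etas[2] == 1.0:
        count2 += 1
    if etas[3] == 1.0:
        count3 += 1
    if etas[2] == 1.0 and etas[3] == 1.0:
        count_both += 1
p2, p3, p_both = count2 / n_paths, count3 / n_paths, count_both / n_paths
# Independence would force p_both ≈ p2 * p3 > 0, yet p_both is exactly 0:
print(p2, p3, p_both)                         # ≈ 1/2, ≈ 1/3, 0.0
```

(Note that `random.random()` never returns $1.0$, so `etas[t] == 1.0` can only arise from the $e_t>1$ branch, matching the almost-sure identification $[\eta_t=1]=[e_t>1]$.)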