Consider a sequence of random variables $\{X_n\}_{n \geq 0}$ defined as follows:
$X_0 = p$ and $X_{n+1} = qX_n + (1-q)\mathbf{1}_{Y_{n+1} \leq X_n}$ for all $n \geq 0$, where $p, q \in (0, 1)$ are constants and $(Y_i)_{i \geq 1}$ is a sequence of i.i.d. random variables, each with the standard uniform distribution $U(0, 1)$. Define $X_{\infty} = \lim_{n \to \infty} X_n$.
I am confused about how to determine the distribution of $X_{\infty}$.
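For intuition, the recursion is easy to simulate. Here is a quick Monte Carlo sketch (the values $p = 0.3$, $q = 0.5$ and the sample sizes are arbitrary illustrative choices, not part of the problem):

```python
import numpy as np

# Illustrative, arbitrarily chosen parameters (any p, q in (0, 1) behave similarly).
p, q = 0.3, 0.5
n_steps, n_paths = 500, 100_000

rng = np.random.default_rng(seed=0)
x = np.full(n_paths, p)  # X_0 = p on every simulated path

for _ in range(n_steps):
    y = rng.random(n_paths)          # Y_{n+1} ~ U(0, 1), i.i.d. across steps and paths
    x = q * x + (1 - q) * (y <= x)   # X_{n+1} = q*X_n + (1-q)*1{Y_{n+1} <= X_n}

print("E[X_n]        ~", x.mean())              # stays near p
print("P(X_n > 1/2)  ~", (x > 0.5).mean())      # fraction of paths near 1, also near p
print("E[X_n(1-X_n)] ~", (x * (1 - x)).mean())  # near 0: mass piles up on {0, 1}
```

For large $n$ the simulated paths pile up at $0$ and $1$, with roughly a fraction $p$ of them near $1$, which suggests that $X_\infty$ is $\{0,1\}$-valued with mean $p$.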
The limiting distribution is Bernoulli with mean $p$. Assuming that the convergence $X_n \to X_\infty$ holds in $L^2$ (this can be proved using martingales, since $(X_n)$ is a bounded martingale), here is a sketch of a proof: the idea is to show that $\mathbb{E}[X_\infty(1 - X_\infty)] = 0$ and to conclude from there.
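One way to carry out the computation (the filtration notation $\mathcal{F}_n = \sigma(Y_1, \dots, Y_n)$ is introduced here for convenience): conditionally on $\mathcal{F}_n$, the indicator $\mathbf{1}_{Y_{n+1} \leq X_n}$ is Bernoulli with parameter $X_n$, hence

$$\mathbb{E}[X_{n+1} \mid \mathcal{F}_n] = qX_n + (1-q)X_n = X_n,$$

so $(X_n)$ is a martingale with values in $[0,1]$ and $\mathbb{E}[X_n] = p$ for every $n$. Moreover $\operatorname{Var}(X_{n+1} \mid \mathcal{F}_n) = (1-q)^2\, X_n(1-X_n)$, which gives

$$\mathbb{E}\bigl[X_{n+1}(1-X_{n+1}) \mid \mathcal{F}_n\bigr] = X_n(1-X_n) - \operatorname{Var}(X_{n+1} \mid \mathcal{F}_n) = \bigl(1 - (1-q)^2\bigr)\, X_n(1-X_n).$$

Taking expectations and iterating from $X_0 = p$,

$$\mathbb{E}\bigl[X_n(1-X_n)\bigr] = p(1-p)\bigl(1 - (1-q)^2\bigr)^n \longrightarrow 0,$$

because $0 < 1 - (1-q)^2 < 1$ for $q \in (0,1)$. Since $X_n \to X_\infty$ in $L^2$ (hence in $L^1$), we get $\mathbb{E}[X_\infty(1-X_\infty)] = \lim_n \mathbb{E}[X_n(1-X_n)] = 0$. A nonnegative random variable with zero mean vanishes almost surely, so $X_\infty \in \{0,1\}$ a.s., and $\mathbb{E}[X_\infty] = \lim_n \mathbb{E}[X_n] = p$ pins down $\mathbb{P}(X_\infty = 1) = p$, i.e. $X_\infty \sim \operatorname{Bernoulli}(p)$.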
Here is an alternative proof which does not require $L^2$ convergence. Again the idea is to show that $\mathbb{E}[X_\infty (1-X_\infty)] = 0$ and to continue from there. To show this, you can proceed as sketched below.
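One natural way (not necessarily the only one) uses boundedness and almost-sure convergence: since $(X_n)$ is a martingale with values in $[0,1]$, the martingale convergence theorem gives $X_n \to X_\infty$ almost surely. The same one-step computation as above yields $\mathbb{E}[X_n(1-X_n)] = p(1-p)\bigl(1-(1-q)^2\bigr)^n \to 0$. As $0 \leq X_n(1-X_n) \leq 1/4$, the bounded convergence theorem applies and

$$\mathbb{E}[X_\infty(1-X_\infty)] = \lim_{n\to\infty} \mathbb{E}\bigl[X_n(1-X_n)\bigr] = 0,$$

so $X_\infty \in \{0,1\}$ almost surely. Applying bounded convergence once more to $X_n$ itself gives $\mathbb{E}[X_\infty] = \lim_n \mathbb{E}[X_n] = p$, hence $\mathbb{P}(X_\infty = 1) = p$ and $X_\infty \sim \operatorname{Bernoulli}(p)$ again.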