Let $I=[0,1]$, let $\varepsilon>0$, and let $(X_n)$ be a sequence of i.i.d. random variables (copies of $X$, with $\operatorname{supp}(X)\subseteq (-\varepsilon,\varepsilon)$ and $\mathbb E(X)>0$). For each $\lambda>0$, we define the Markov process $Y^\lambda$ by:
\begin{equation} Y^\lambda_n = \begin{cases}Y^\lambda_{n-1}+\lambda X_n, &\text{if } Y^\lambda_{n-1}+\lambda X_n\in[0,1],\\ 1, &\text{if } Y^\lambda_{n-1}+\lambda X_n>1,\\ 0, &\text{if } Y^\lambda_{n-1}+\lambda X_n<0, \end{cases} \end{equation} with $Y^\lambda_0=0$ almost surely. More informally, we consider the random walk on the interval $I$ with increments $\lambda X_n$, except that whenever a step would leave the interval, the walk stops at the corresponding boundary.
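The dynamics above can be sketched in a short simulation. The distribution of $X$ here (uniform on $(-0.4,0.6)$, so bounded support and mean $0.1>0$) is purely an illustrative assumption, not part of the problem:

```python
import random

def step(y, lam, x):
    """One transition of Y^lambda: take the step, then clip to [0, 1]."""
    return min(max(y + lam * x, 0.0), 1.0)

def simulate(lam, n_steps, rng):
    """Sample path of Y^lambda started at Y_0 = 0.
    X ~ Uniform(-0.4, 0.6) (mean 0.1 > 0) is an illustrative choice."""
    y = 0.0
    path = [y]
    for _ in range(n_steps):
        y = step(y, lam, rng.uniform(-0.4, 0.6))
        path.append(y)
    return path

rng = random.Random(0)
path = simulate(0.1, 1000, rng)
```

Because of the positive drift, a typical path climbs toward $1$ and then hovers near the upper boundary.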
$\textbf{Questions}$:
- Is the Markov process $Y^\lambda$ ergodic? That is, does each of these Markov chains have a stationary distribution $\pi^\lambda$?
- If so, does it hold that $\lim_{\lambda\rightarrow0}\pi^\lambda([0,1))=0$? In other words, does the stationary mass concentrate on the point $1$?
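The second question can be probed numerically by estimating $\pi^\lambda([0,1))$ as the long-run fraction of time the chain spends strictly below $1$ (an ergodic-average heuristic, which presupposes a positive answer to the first question). As before, $X\sim\mathrm{Uniform}(-0.4,0.6)$ is an illustrative assumption:

```python
import random

def mass_below_one(lam, n_steps, burn_in, rng):
    """Estimate pi^lambda([0,1)) by the fraction of (post-burn-in) time
    that Y^lambda spends strictly below 1. X ~ Uniform(-0.4, 0.6) is an
    illustrative choice, not part of the problem statement."""
    y = 0.0
    hits = 0
    for t in range(n_steps):
        y = min(max(y + lam * rng.uniform(-0.4, 0.6), 0.0), 1.0)
        if t >= burn_in and y < 1.0:
            hits += 1
    return hits / (n_steps - burn_in)

rng = random.Random(1)
estimates = [mass_below_one(lam, 200_000, 20_000, rng)
             for lam in (0.5, 0.1, 0.02)]
```

Inspecting `estimates` as $\lambda$ decreases gives a (non-rigorous) indication of whether $\pi^\lambda([0,1))$ tends to $0$.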
$\textbf{Intuition}$:
- I think the first point should follow from the Krylov–Bogolyubov theorem (https://en.wikipedia.org/wiki/Krylov%E2%80%93Bogolyubov_theorem).
- We can consider the following sequence of Markov chains:
\begin{equation} \overline{Y}^\lambda_n = \begin{cases}\overline{Y}^\lambda_{n-1}+\lambda X_n, &\text{if } \overline{Y}^\lambda_{n-1}+\lambda X_n\le 1,\\ 1, &\text{if } \overline{Y}^\lambda_{n-1}+\lambda X_n>1, \end{cases} \end{equation} i.e., the same walk but clipped only at the upper boundary. Since the support of $X$ is bounded, we have $\lim_{\lambda\rightarrow0^+}\Pr[|Y^\lambda_n-\overline{Y}^\lambda_n|>\delta]= 0$ for all $\delta>0$. Moreover, since the expectation of $X$ is positive, the chain $\{\overline{Y}^\lambda_n\}$ has a stationary distribution $\overline{\pi}^\lambda$. My intuition is that, under sufficiently good conditions, $\overline{\pi}^\lambda$ and $\pi^\lambda$ should converge to the same distribution, so the problem reduces to studying $\{\overline{Y}^\lambda_n\}$.
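The closeness of the two chains can be checked pathwise by coupling: run $Y^\lambda$ and $\overline{Y}^\lambda$ with the same increments $\lambda X_n$ and record the largest gap. Again, $X\sim\mathrm{Uniform}(-0.4,0.6)$ is an illustrative assumption:

```python
import random

def coupled_max_gap(lam, n_steps, rng):
    """Run Y^lambda (clipped at both 0 and 1) and Ybar^lambda (clipped at 1
    only) with the SAME increments lam*X_n, and return the largest pathwise
    gap max_n |Y_n - Ybar_n|. X ~ Uniform(-0.4, 0.6) is illustrative."""
    y = ybar = 0.0
    max_gap = 0.0
    for _ in range(n_steps):
        dx = lam * rng.uniform(-0.4, 0.6)
        y = min(max(y + dx, 0.0), 1.0)      # reflected-at-both-ends chain
        ybar = min(ybar + dx, 1.0)          # clipped only at 1
        max_gap = max(max_gap, abs(y - ybar))
    return max_gap

rng = random.Random(2)
gap = coupled_max_gap(0.05, 100_000, rng)
```

Under this coupling the gap is driven by how far $\overline{Y}^\lambda$ dips below $0$, which is of order $\lambda$ for small $\lambda$, consistent with the convergence-in-probability claim above.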