Let the sequence of random variables $\{X_{n}\}$, $n = 1,2, \ldots$ be a Markov chain which is sufficiently "ergodic" that it has a stationary distribution $\pi$, and for a function $f$ the sequence of random variables $$Y_{n} = \frac{1}{n}\sum\limits_{i=1}^{n}f(X_{i})$$ has the convergence property $Y_{n} \rightarrow \mathbb{E}_{\pi}[f(X)]$ w.p. 1.
Since each $Y_{n}$ is an average, I am wondering if one can define a slightly different average that doesn't so explicitly use the iteration number $n$. For instance, suppose we defined $Y_{1} = f(X_{1})$ as before, but $$Y_{n+1} = \theta f(X_{n}) + (1-\theta)Y_{n}$$ for some fixed $\theta$. Are there situations where this $Y_{n}$ would still converge to $\mathbb{E}_{\pi}[f(X)]$ w.p. 1?
This is called an exponentially weighted moving average: unrolling the recursion gives $Y_n = \sum_{j=1}^{n-1} \theta (1-\theta)^{j-1} f(X_{n-j}) + (1-\theta)^{n-1} f(X_1)$. You can't expect it to converge to a constant, because the dependence on the most recent $f(X_m)$ doesn't go to $0$: the latest term always carries weight $\theta$, no matter how large $n$ is. But if $0 < \theta < 1$ and things are nice, it ought to converge in distribution to a (non-degenerate) random variable.
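A quick simulation illustrates the contrast. This is a minimal sketch, assuming a hypothetical two-state chain with transition probabilities $P(0\to1)=0.3$, $P(1\to0)=0.2$ (so $\pi = (0.4, 0.6)$) and $f(x)=x$, hence $\mathbb{E}_{\pi}[f(X)] = 0.6$; the chain, $\theta$, and step count are all illustrative choices, not from the question.

```python
import random

random.seed(0)

# Hypothetical two-state chain: P(0->1) = 0.3, P(1->0) = 0.2.
# Solving pi P = pi gives pi = (0.4, 0.6), so with f(x) = x,
# E_pi[f(X)] = 0.6.
def step(x):
    if x == 0:
        return 1 if random.random() < 0.3 else 0
    return 0 if random.random() < 0.2 else 1

theta = 0.1            # fixed weight for the exponentially weighted average
x = 0
running_sum = 0.0
ewma = float(x)        # Y_1 = f(X_1)
n_steps = 200_000

for n in range(1, n_steps + 1):
    running_sum += x
    cesaro = running_sum / n                  # (1/n) sum_{i<=n} f(X_i)
    ewma = theta * x + (1 - theta) * ewma     # the recursion from the question
    x = step(x)

print(f"Cesaro average: {cesaro:.3f}")  # settles near 0.6
print(f"EWMA:           {ewma:.3f}")    # keeps fluctuating; run-dependent
```

Rerunning with different seeds makes the point: the Cesaro average lands near $0.6$ every time, while the final EWMA value varies from run to run, consistent with convergence in distribution rather than to a constant.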