EDIT: Solved! It turns out that if the function is continuous and various regularity conditions hold, then the statement is true. This has been established in the 'stochastic approximation' literature; see Robbins–Monro and subsequent work.
I was wondering if anyone knows whether a particular result has been proven or is indeed true. My problem is as follows.
Suppose I have a stationary, ergodic Markov chain that follows a process of the form:
$ \theta_{t+1} = \theta_t + \alpha F(\theta_t ; \epsilon_{t+1}) $
where $F(\cdot)$ is some function, $\epsilon_{t+1}$ is a vector of exogenous random variables drawn each period, and $\alpha \in (0, 1]$ is a constant ($\theta$ is a vector, in case that's relevant).
Now, I'm concerned with what happens to the stationary distribution as I reduce $\alpha$ towards zero. I suspect the following:
If there exists a unique value $\theta^*$ such that $\mathbb{E}[F(\theta^* ; \epsilon)] = 0$, then as $\alpha$ goes to zero, the stationary distribution converges to a point mass at $\theta^*$.
Does anyone know whether this is true, or true under certain conditions? Simulation on my particular problem suggests it may be the case. Any help greatly appreciated.
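For anyone who wants to see the effect numerically, here is a minimal sketch (not my actual problem) with a hypothetical choice $F(\theta; \epsilon) = -(\theta - \theta^*) + \epsilon$ and $\epsilon \sim N(0,1)$, so that $\mathbb{E}[F(\theta; \epsilon)] = 0$ uniquely at $\theta = \theta^*$. The names `simulate`, `theta_star`, and the step/burn-in counts are all illustrative choices, not part of the question:

```python
import numpy as np

# Hypothetical F: E[F(theta; eps)] = -(theta - theta_star), which is
# zero only at theta = theta_star. For this linear F the chain is an
# AR(1) process, so its stationary distribution is known in closed form.
theta_star = 2.0

def simulate(alpha, steps=200_000, burn_in=50_000, seed=0):
    """Run theta_{t+1} = theta_t + alpha * F(theta_t; eps_{t+1})
    and return the post-burn-in samples."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = np.empty(steps - burn_in)
    for t in range(steps):
        eps = rng.standard_normal()
        theta = theta + alpha * (-(theta - theta_star) + eps)
        if t >= burn_in:
            samples[t - burn_in] = theta
    return samples

for alpha in (0.5, 0.1, 0.02):
    s = simulate(alpha)
    print(f"alpha={alpha}: mean={s.mean():.3f}, std={s.std():.3f}")
```

In this linear case the stationary variance is $\alpha/(2-\alpha)$, so the spread around $\theta^*$ shrinks like $\sqrt{\alpha/2}$ as $\alpha \to 0$, consistent with the conjectured concentration at $\theta^*$.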