Let $(\mathcal S, d)$ be a metric space, let $\{f_\theta : \theta \in \Theta\}$ be a family of Lipschitz functions on $\mathcal S$ with Lipschitz constants $K_\theta$, and let $\mu$ be a probability distribution on $\Theta$. Suppose that $\int K_\theta\,\mu(d\theta) < \infty$ and $\int d(f_\theta(x_0), x_0)\,\mu(d\theta) < \infty$ for some $x_0 \in \mathcal S$, and that $\int \log K_\theta\,\mu(d\theta) < 0$. Then the induced Markov chain has a unique stationary distribution. This is Theorem $1.1$ on page $47$ of the paper *Iterated Random Functions* by Diaconis and Freedman.
Now, I am not able to understand $\theta$, $\Theta$, $d\theta$, and $\mu(d\theta)$ in this context. Could anyone help me understand these four objects, perhaps with a small example? That would help a lot, thanks.
For example, can I take $\Theta = \{1, 2, 3, 4, 5\}$? If yes, then what happens to the other three objects?
If you were to take $\Theta = \{1, 2, 3, 4, 5\}$, then $\theta$ would be any one of $1, 2, 3, 4, 5$. That is, $\theta \in \Theta$ means the variable $\theta$ can take on any of the values in the set $\Theta$. Since this $\Theta$ is discrete, not continuous, the differential "$d\theta$" on its own is not defined. If instead we take $\Theta = (0, 1)$, the open interval from $0$ to $1$, then $\theta$ could be any number between $0$ and $1$, and $d\theta$ would be the usual differential from calculus. The notation "$\mu(d\theta)$" does not require "$d\theta$" to make sense by itself: it simply indicates integration with respect to the probability measure $\mu$, and "$d\mu(\theta)$" is an equivalent notation you will also see. In your discrete example, the integrals in the theorem reduce to weighted sums, e.g. $\int K_\theta\,\mu(d\theta) = \sum_{\theta \in \Theta} K_\theta\,\mu(\{\theta\})$.
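To make the discrete case concrete, here is a small simulation sketch. All the specifics are my own choices, not from the theorem: I take $\mathcal S = \mathbb R$ with $d(x,y) = |x-y|$, $\Theta = \{1,2,3,4,5\}$ with $\mu$ uniform, and the (hypothetical) maps $f_\theta(x) = x/2 + \theta$. Each $f_\theta$ has Lipschitz constant $K_\theta = 1/2$, so $\int \log K_\theta\,\mu(d\theta) = \log(1/2) < 0$ and the theorem's "contracting on average" condition holds. Running two chains from different starting points with the same random $\theta$'s shows them being driven together:

```python
import random
import math

# Hypothetical example: Theta = {1,...,5}, mu = uniform,
# f_theta(x) = x/2 + theta on S = R. Each f_theta has K_theta = 1/2.
Theta = [1, 2, 3, 4, 5]
mu = {theta: 1 / 5 for theta in Theta}  # uniform probability on Theta

def f(theta, x):
    return x / 2 + theta

# Integrals against mu(d theta) are just weighted sums here:
avg_log_K = sum(math.log(0.5) * mu[t] for t in Theta)  # log(1/2) < 0

# Two chains driven by the SAME random thetas but different starts;
# average contraction forces them toward each other.
random.seed(0)
x, y = 0.0, 100.0
for _ in range(100):
    theta = random.choice(Theta)  # draw theta according to mu
    x, y = f(theta, x), f(theta, y)

print(avg_log_K)          # negative, about -0.693
print(abs(x - y) < 1e-9)  # the two chains have coalesced
```

The common limit of such coupled chains (run "backward", as in the paper) is distributed according to the unique stationary distribution the theorem guarantees.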