Suppose that a finite-state Markov chain with transition matrix $M$ has multiple stationary distributions (or invariant measures). Let $\Pi$ denote the set of stationary distributions.
Fix any $\pi \in \Pi$. Does there exist a Markov chain with transition matrix $M'$ with the following properties?
- $M'$ is "close" to $M$.
- $M'$ has a unique stationary distribution $\pi'$.
- $\pi'$ is "close" to $\pi$.
Here, "close" means that each entry of the difference $\vert M' - M \vert$ is sufficiently close to zero, and likewise each entry of $\vert \pi' - \pi \vert$ is sufficiently close to zero.
In other words, is it possible to select any one stationary distribution $\pi \in \Pi$ by slightly perturbing the Markov chain $M$?
How about countable- or uncountable-state Markov chains?
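To make the setup concrete, here is a minimal numerical sketch (the particular 3-state chain is a hypothetical example, not from the question): a chain with two absorbing states has two irreducible classes, and every convex combination of the two point masses is stationary.

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 are absorbing
# (two irreducible classes), state 2 is transient.
P = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.5, 0.5, 0.0],
])

# Any convex combination of e0 = (1,0,0) and e1 = (0,1,0) is stationary,
# so the set Pi of stationary distributions is a whole line segment.
for theta in (0.0, 0.3, 1.0):
    pi = np.array([theta, 1.0 - theta, 0.0])
    assert np.allclose(pi @ P, pi)
```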
Here is a refinement of Ian's answer showing that the desired perturbation exists for any stationary distribution of the original chain (possibly a convex combination of the distributions associated with the irreducible classes). Fix $\epsilon$ with $0<\epsilon<1$; the construction makes the error $O(\epsilon)$.
Consider a discrete-time Markov chain with a finite state space $S$ and transition probability matrix $P$, and let $\pi = (\pi_i)_{i \in S}$ be a particular stationary distribution, so that $\pi = \pi P$. Suppose the stationary distribution is not unique. Let $T$ be the set of transient states, and suppose there are $K$ irreducible classes ($K\geq 2$, since the stationary distribution is not unique). Let $A_1, ..., A_K$ be the sets of states associated with the irreducible classes (the sets $A_1, ..., A_K$ are disjoint).
Each class $k \in \{1, ..., K\}$ has a stationary distribution $\pi_k= (\pi_i^k)_{i \in S}$ such that $\pi_i^k = 0$ whenever $i \notin A_k$, and $\pi_i^k>0$ if $i \in A_k$. Further, $\pi$ is a stationary distribution of $P$ if and only if it can be written as a convex combination: $$ \pi = \sum_{k=1}^K \theta_k \pi_k $$ for some nonnegative values $\theta_1, ..., \theta_K$ that sum to 1.
Case 1: $\theta_k>0$ for all $k \in \{1, ..., K\}$. Then $\pi_i >0$ for all $i \notin T$. Define a square matrix $R$ whose rows are all equal to $\pi$: $$ R = [\pi; \pi ; ... ; \pi ]$$ Define a modified transition probability matrix:
$$ \tilde{P} = (1-\epsilon) P + \epsilon R $$ This is a valid transition probability matrix: all entries are nonnegative and all rows sum to 1. Further, $\tilde{P}$ is an $O(\epsilon)$ approximation of $P$. Since $\pi_i>0$ for all $i \notin T$, the $R$ term lets every state reach every state outside $T$ in one step, so all states not in $T$ communicate under $\tilde{P}$: the modified Markov chain has a single irreducible class and hence a unique stationary distribution. In fact that unique distribution is exactly $\pi$, since: $$ \pi \tilde{P} = (1-\epsilon) \pi P + \epsilon \pi R = (1-\epsilon)\pi + \epsilon \pi = \pi $$ Thus, we can perturb the transition probabilities of the original Markov chain by $O(\epsilon)$ to obtain a new Markov chain whose unique stationary distribution is exactly the desired $\pi$. $\Box$
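The Case 1 construction can be checked numerically. The following sketch uses a hypothetical 3-state chain (two absorbing states, one transient state) and a target $\pi$ with both class weights positive; power iteration confirms that the perturbed chain converges to $\pi$ from a uniform start.

```python
import numpy as np

# Hypothetical 3-state chain: classes {0} and {1}, transient state 2.
P = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.5, 0.5, 0.0],
])
pi = np.array([0.3, 0.7, 0.0])   # theta_1, theta_2 > 0, so Case 1 applies
eps = 0.01

R = np.tile(pi, (3, 1))          # every row of R equals pi
P_tilde = (1 - eps) * P + eps * R

# Rows still sum to 1, and pi remains stationary for the perturbed chain.
assert np.allclose(P_tilde.sum(axis=1), 1.0)
assert np.allclose(pi @ P_tilde, pi)

# Uniqueness: iterating the perturbed (now irreducible, aperiodic on the
# recurrent states) chain converges to pi from any starting distribution.
mu = np.ones(3) / 3
for _ in range(5000):
    mu = mu @ P_tilde
assert np.allclose(mu, pi)
```

Note that the transient state keeps probability 0 under $\pi$, so the perturbation never pushes mass back into $T$.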
Case 2: $\theta_k=0$ for some $k \in \{1, ..., K\}$. Let $z$ be the number of indices $k \in \{1, ..., K\}$ such that $\theta_k=0$. Define: $$ \tilde{\theta}_k = \left\{ \begin{array}{ll} (1-\epsilon)\theta_k &\mbox{ if $\theta_k>0$} \\ \epsilon/z & \mbox{ if $\theta_k=0$} \end{array} \right.$$ Then $\tilde{\theta}_1, ..., \tilde{\theta}_K$ are strictly positive and sum to 1 (the positive entries contribute $1-\epsilon$ in total, and the $z$ new entries contribute $\epsilon$). Define $$ \tilde{\pi} = \sum_{k=1}^K \tilde{\theta}_k \pi_k $$ Then $\tilde{\pi}$ is an $O(\epsilon)$ approximation of $\pi$. Also, $\tilde{\pi}_i>0$ for all $i \notin T$, and $\tilde{\pi}$ is a stationary distribution of the original Markov chain with transition probability matrix $P$. So we can use the method in Case 1 to construct a matrix $\tilde{P}$ that is an $O(\epsilon)$ approximation of $P$ and that has a unique stationary distribution equal to $\tilde{\pi}$. Specifically, $$ \tilde{P} = (1-\epsilon)P + \epsilon[\tilde{\pi}; \tilde{\pi}; ... ; \tilde{\pi}]$$ $\Box$
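A sketch of the Case 2 reweighting, again on a hypothetical 3-state chain: the target $\pi$ puts zero weight on one class, so we first shift $\epsilon/z$ mass onto each zero-weight class before applying the Case 1 perturbation.

```python
import numpy as np

# Hypothetical 3-state chain: classes {0} and {1}, transient state 2.
P = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.5, 0.5, 0.0],
])
pis = [np.array([1.0, 0.0, 0.0]),   # stationary distribution of class {0}
       np.array([0.0, 1.0, 0.0])]   # stationary distribution of class {1}
theta = np.array([1.0, 0.0])        # target pi = pis[0]; Case 2 since theta_2 = 0
eps = 0.01

# Reweight: shrink positive weights by (1 - eps), give eps/z to each zero.
z = np.sum(theta == 0)
theta_tilde = np.where(theta > 0, (1 - eps) * theta, eps / z)
pi_tilde = sum(t * p for t, p in zip(theta_tilde, pis))

# pi_tilde is stationary for P, strictly positive on recurrent states,
# and within O(eps) of the target pi, so the Case 1 construction applies.
P_tilde = (1 - eps) * P + eps * np.tile(pi_tilde, (3, 1))
assert np.allclose(pi_tilde @ P, pi_tilde)
assert np.allclose(pi_tilde @ P_tilde, pi_tilde)
assert np.max(np.abs(pi_tilde - pis[0])) < 2 * eps
```

Here the overall error is $O(\epsilon)$ in both the transition matrix and the stationary distribution, matching the claim in the answer.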