Consider a Markov chain with discrete time and discrete state space, with transition matrix $M$.
A probability distribution $\pi$ on the states is said to be an invariant (stationary) distribution when $\pi M=\pi$. It is clear that the set of invariant distributions is convex: if $\pi_1$ and $\pi_2$ are invariant, then so is $\alpha \pi_1 + (1-\alpha)\pi_2$ for every $0\leq\alpha\leq 1$.
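As a quick sanity check (not a proof), here is a small numerical experiment with a concrete 4-state chain of my own choosing, using NumPy. The chain has two recurrent classes, so its set of stationary distributions is a nontrivial segment:

```python
import numpy as np

# Block-diagonal chain with two recurrent classes {0,1} and {2,3},
# so the stationary distributions form a nontrivial convex set.
M = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.3, 0.7, 0.0, 0.0],
              [0.0, 0.0, 0.2, 0.8],
              [0.0, 0.0, 0.6, 0.4]])

pi1 = np.array([3/8, 5/8, 0, 0])   # stationary on the first class
pi2 = np.array([0, 0, 3/7, 4/7])   # stationary on the second class
assert np.allclose(pi1 @ M, pi1) and np.allclose(pi2 @ M, pi2)

# m-convexity: the convex (arithmetic) combination is again stationary.
alpha = 0.3
mix = alpha * pi1 + (1 - alpha) * pi2
assert np.allclose(mix @ M, mix)
```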
It seems to me that, using the fact that relative entropy to a stationary distribution decreases along the chain, one can also show that the distribution $P_{\alpha}=\frac{1}{c(\alpha)}\pi_1^\alpha \pi_2^{1-\alpha}$, $0\leq \alpha \leq 1$, is stationary (here $\pi^\alpha$ means that the probability of each state is raised to the power $\alpha$, and $c(\alpha)$ is the normalising constant). Here is my reasoning. We know that if $\pi$ is stationary, then for any probability distribution $p$, $$ K(pM,\pi)\leq K(p,\pi), $$ where $K$ is the Kullback-Leibler divergence. Let $\pi_1$ and $\pi_2$ be stationary. Given $0\leq d\leq K(\pi_2,\pi_1)$, choose $p$ such that $K(p,\pi_1)=d$ and such that $K(p,\pi_2)$ is minimal. From the inequality $K(pM,\pi_i)\leq K(p,\pi_i)$ and the convexity of $K$, I think we can deduce that $p$ is stationary. A short calculation also leads to $p=\frac{1}{c(\alpha)}\pi_1^\alpha \pi_2^{1-\alpha}$ for some $0\leq \alpha \leq 1$.
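The geometric-mixture claim can also be checked numerically (again a sanity check, not a proof). In the toy example below, a chain of my own choosing with two recurrent classes, I take $\pi_1$ and $\pi_2$ with full support so that $\pi_1^\alpha \pi_2^{1-\alpha}$ is not identically zero and $c(\alpha)>0$:

```python
import numpy as np

# Block-diagonal chain with two recurrent classes {0,1} and {2,3}.
M = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.3, 0.7, 0.0, 0.0],
              [0.0, 0.0, 0.2, 0.8],
              [0.0, 0.0, 0.6, 0.4]])

# Per-class stationary distributions, then two fully supported
# stationary distributions obtained as different mixtures of them.
mu1 = np.array([3/8, 5/8, 0, 0])
mu2 = np.array([0, 0, 3/7, 4/7])
pi1 = 0.7 * mu1 + 0.3 * mu2
pi2 = 0.2 * mu1 + 0.8 * mu2
assert np.allclose(pi1 @ M, pi1) and np.allclose(pi2 @ M, pi2)

# e-convexity: the normalised geometric mixture
# P_alpha = pi1^alpha * pi2^(1-alpha) / c(alpha) is stationary too.
alpha = 0.4
geo = pi1**alpha * pi2**(1 - alpha)
P_alpha = geo / geo.sum()
assert np.allclose(P_alpha @ M, P_alpha)
```

In this example one can see why it works: on each recurrent class, $\pi_1$ and $\pi_2$ are proportional to the same per-class stationary distribution, so the pointwise geometric mixture only reweights the classes.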
In terms of information geometry, this would mean that the set of stationary distributions is both $m$-convex and $e$-convex.
I couldn't find a reference explaining this. Is it correct?
Thank you