Let $\Pi$ be a random Markov matrix of dimension $m \times m$, that is, its elements satisfy $\Pi_{ij} \in (0,1)$ and its rows sum to one. The rows are independent random variables, i.e. we can write $$\Pi = \begin{bmatrix} \Pi_1 \\ \vdots \\ \Pi_m \end{bmatrix}$$ where each row vector $\Pi_i$ is drawn from a Dirichlet distribution with concentration parameters $(\alpha_{i1},\dots,\alpha_{im})$. We know that for a nonrandom Markov matrix that is irreducible and aperiodic (which holds here, since all entries are strictly positive), \begin{equation} \lim_{n \to \infty} \Pi^n = 1\pi \end{equation} where $\pi$ is the invariant distribution (written as a row vector) and $1$ is a column vector of ones. I am interested in the invariant distribution associated with my random Markov matrix. In particular, can we say something about $\mathbb{E}[\log(\pi^{(n)})]$ or $\mathbb{E}[\pi^{(n)}]$ for a component $n \in \{1,\dots,m\}$ of the invariant distribution?
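For intuition, the quantities in question are easy to approximate by simulation: sample a matrix with independent Dirichlet rows, solve for its invariant distribution, and average. This is a minimal sketch (the concentration matrix `alpha` is a hypothetical example, and the stationary distribution is obtained by solving $\pi \Pi = \pi$, $\sum_n \pi^{(n)} = 1$ as a least-squares system):

```python
import numpy as np

def stationary(P):
    """Invariant distribution of a stochastic matrix P: solve pi P = pi with sum(pi) = 1."""
    m = P.shape[0]
    # Stack the fixed-point equations with the normalization constraint.
    A = np.vstack([P.T - np.eye(m), np.ones((1, m))])
    b = np.zeros(m + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

rng = np.random.default_rng(0)
alpha = np.array([[2.0, 1.0, 1.0],
                  [1.0, 3.0, 1.0],
                  [1.0, 1.0, 2.0]])  # hypothetical concentration parameters

def sample_pi():
    # One draw of the random Markov matrix: each row is an independent Dirichlet vector.
    P = np.array([rng.dirichlet(a) for a in alpha])
    return stationary(P)

draws = np.array([sample_pi() for _ in range(2000)])
E_pi = draws.mean(axis=0)              # Monte Carlo estimate of E[pi]
E_log_pi = np.log(draws).mean(axis=0)  # Monte Carlo estimate of E[log pi]
```

By Jensen's inequality, $\mathbb{E}[\log(\pi^{(n)})] \le \log \mathbb{E}[\pi^{(n)}]$, which the two estimates can be checked against.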
Added:
In the $2 \times 2$ case, if $$\Pi = \begin{bmatrix} 1-\Pi_1 & \Pi_1 \\ \Pi_2 & 1-\Pi_2 \end{bmatrix}$$ then $$ \pi = \begin{bmatrix} \frac{\Pi_2}{\Pi_1+\Pi_2} \\ \frac{\Pi_1}{\Pi_1+\Pi_2} \end{bmatrix}$$ and, for instance, $$\mathbb{E}[\log(\pi^{(1)})] = \mathbb{E}[\log(\Pi_2)] - \mathbb{E}[\log(\Pi_1+\Pi_2)],$$ which can be evaluated since $\Pi_1$ and $\Pi_2$ are independent with Beta marginals ($\Pi_1 \sim \mathrm{Beta}(\alpha_{12},\alpha_{11})$, $\Pi_2 \sim \mathrm{Beta}(\alpha_{21},\alpha_{22})$); in particular, $\mathbb{E}[\log(\Pi_2)] = \psi(\alpha_{21}) - \psi(\alpha_{21}+\alpha_{22})$ with $\psi$ the digamma function. How would this generalize to $m > 2$?
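The $2 \times 2$ decomposition above can be checked numerically: estimate $\mathbb{E}[\log(\pi^{(1)})]$ directly by Monte Carlo, and compare it with the right-hand side, using the digamma formula $\mathbb{E}[\log X] = \psi(a) - \psi(a+b)$ for $X \sim \mathrm{Beta}(a,b)$ and simulation for the $\mathbb{E}[\log(\Pi_1+\Pi_2)]$ term. A sketch with hypothetical concentration parameters:

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(1)

# Hypothetical 2x2 concentration parameters: row i of Pi is Dirichlet(a_i1, a_i2),
# so the off-diagonals satisfy Pi1 ~ Beta(a12, a11) and Pi2 ~ Beta(a21, a22), independent.
a11, a12 = 2.0, 3.0
a21, a22 = 1.5, 2.5

n = 200_000
P1 = rng.beta(a12, a11, size=n)  # off-diagonal entry of row 1
P2 = rng.beta(a21, a22, size=n)  # off-diagonal entry of row 2

# Direct Monte Carlo estimate of E[log(pi^(1))] = E[log(Pi2 / (Pi1 + Pi2))].
lhs = np.mean(np.log(P2 / (P1 + P2)))

# Decomposition: exact E[log Pi2] via digamma, E[log(Pi1 + Pi2)] by simulation.
E_log_P2 = digamma(a21) - digamma(a21 + a22)
rhs = E_log_P2 - np.mean(np.log(P1 + P2))
```

The two estimates should agree up to Monte Carlo error; the hard part that remains for $m > 2$ is that $\pi$ is no longer a simple ratio of independent Beta variables.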