Stationary distributions are left eigenvectors of the transition matrix of a Markov chain.
Does anyone have a good understanding of why the stationary distribution is a left eigenvector? I'm searching for geometric intuition, and for links between graph theory and linear algebra. I learned about eigenvectors as vectors of symmetry, ones that remain fixed under the transformation. However, I can't yet visualize a Markov graph in terms of vectors.
I have been trying changes of basis on the matrix to see what happens to the transition graph, but this has gotten me nowhere.
Do you have any insight on that question?
A good related question: connection between graphs and the eigenvectors of their matrix representation
Let $P$ be the $n \times n$ matrix of transition probabilities, and let $\mu = (\mu_1,\dots,\mu_n)$ be a probability distribution.
The key is to note that given an initial probability distribution $\mu$, the entries of $\mu^T P$ are the probabilities that we end up in each state after taking one step in the chain (given that we chose the initial state according to the distribution $\mu$). In particular, $$ \Bbb P(\text{land in state } j) = \sum_{i=1}^n \Bbb P(\text{start in state } i) \cdot \Bbb P(\text{transition from } i \text{ to } j) = \sum_{i=1}^n \mu_i \cdot P_{ij}, $$ which is indeed the $j$th entry of $\mu^T P$.
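To make the "one step" computation above concrete, here is a minimal NumPy sketch with a made-up 3-state transition matrix (the matrix and the starting distribution are illustrative, not from the original question):

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1,
# so P[i, j] = probability of moving from state i to state j.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
])

# Initial distribution mu: start in state 0 with certainty.
mu = np.array([1.0, 0.0, 0.0])

# One step of the chain: the j-th entry of mu^T P is
# sum_i mu_i * P[i, j], the probability of landing in state j.
mu_next = mu @ P
print(mu_next)  # [0.5 0.3 0.2] -- row 0 of P, since we started in state 0
```

Because we started deterministically in state 0, the result is just row 0 of $P$; a mixed starting distribution would give a convex combination of the rows.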
By definition, a stationary distribution is one that remains unchanged after taking one step (and therefore arbitrarily many steps) in the chain. That is, $\mu$ is a stationary distribution if and only if $\mu^T P = \mu^T$, which says exactly that $\mu$ is a left eigenvector of $P$ with eigenvalue $1$.
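One can compute such a distribution numerically by noting that left eigenvectors of $P$ are right eigenvectors of $P^T$. A sketch, reusing a hypothetical 3-state matrix (not part of the original answer):

```python
import numpy as np

# Hypothetical row-stochastic transition matrix.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
])

# Left eigenvectors of P are right eigenvectors of P.T;
# a stochastic matrix always has eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)

# Select the eigenvector belonging to the eigenvalue closest to 1.
idx = np.argmin(np.abs(vals - 1.0))
mu = np.real(vecs[:, idx])

# Normalize so the entries sum to 1 (this also fixes an overall sign).
mu = mu / mu.sum()

# Stationarity check: mu^T P == mu^T.
print(np.allclose(mu @ P, mu))  # True
```

For small chains this eigen-decomposition approach is the most direct way to see the "left eigenvector with eigenvalue 1" statement in action.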