It seems like the binomial distribution and a Markov chain where you move "up 1" with probability $p$ and "down 1" with probability $1-p$ are closely related.
Does anyone know the formal relationship?
To be more specific: with $n$ trials in a binomial distribution (corresponding to $n$ steps of the Markov chain), the distribution over the chain's states appears to match the binomial distribution.

The Markov chain most closely linked to the binomial distribution is one where the particle moves up with probability $p$ and stays in place with probability $1-p$. Then after $n$ steps of this chain started at $0$, the distribution attained is precisely the Binomial$(n,p)$ distribution.
If $X_n$ denotes the state of this lazy chain (started from $0$) after $n$ steps, then the state of your up/down chain (started from $0$) after $n$ steps is $2X_n - n$.
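The correspondence is easy to check by simulation. Here is a minimal sketch (function names are my own) that couples the two chains by reusing the same random draws, so the identity $2X_n - n$ holds on every sample path, not just in distribution:

```python
import random

def lazy_step(state, p):
    # Move up with probability p, stay in place with probability 1 - p.
    return state + 1 if random.random() < p else state

def walk_step(state, p):
    # Move up with probability p, down with probability 1 - p.
    return state + 1 if random.random() < p else state - 1

def run(step, n, p):
    # Run n steps of the chain started at 0.
    state = 0
    for _ in range(n):
        state = step(state, p)
    return state

# Reset the seed before each run so both chains see the same
# sequence of uniform draws (a coupling of the two chains).
n, p, seed = 20, 0.3, 42
random.seed(seed)
lazy = run(lazy_step, n, p)   # Binomial(n, p) distributed
random.seed(seed)
walk = run(walk_step, n, p)   # up/down walk
assert walk == 2 * lazy - n
```

Averaging `lazy` over many independent runs recovers the Binomial$(n,p)$ mean $np$, while `walk` has mean $2np - n = n(2p-1)$.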