I have an $n \times n$ matrix, where the rows represent the current state and the columns the next state. Each cell holds a weight for how likely it is that the system moves from the current state to that next state when the state changes.
So, for instance, if I have two states A and B, I would have the $2 \times 2$ matrix:
\begin{bmatrix} 15 & 5 \\ 8 & 2 \end{bmatrix}
Going from state A to state A has a weight of 15, so it is 3 times as likely as going from A to B (which has a weight of only 5). When in state B (second row), there is 4 times as much chance of going to state A as of staying in state B (a weight of 8 vs. 2).
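In other words, dividing each row by its row sum turns the weights into probabilities:

$$\Pr(A \to A) = \tfrac{15}{15+5} = 0.75, \qquad \Pr(A \to B) = \tfrac{5}{20} = 0.25,$$
$$\Pr(B \to A) = \tfrac{8}{8+2} = 0.8, \qquad \Pr(B \to B) = \tfrac{2}{10} = 0.2.$$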
Let's call this matrix $M$. Now my question: is it correct that if I multiply this matrix by itself ($M \cdot M$), I get a matrix that represents the weights of being in each state after 2 state changes? And that $M^3$ represents the weights after 3 state changes?
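For concreteness, a quick sketch (using NumPy, purely as an illustration) of what $M \cdot M$ looks like for the example matrix:

```python
import numpy as np

# The raw weight matrix from the example: rows = current state, columns = next state.
M = np.array([[15, 5],
              [8, 2]])

# Multiplying M by itself sums, over each intermediate state,
# the products of the one-step weights.
M2 = M @ M
print(M2)  # [[265  85]
           #  [136  44]]
```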
Posting as an answer at the OP's request, although all I did was know what to search for.
Wikipedia on Markov chains (https://en.wikipedia.org/wiki/Markov_chain) says

> The $k$-step transition probability can be computed as the $k$-th power of the transition matrix, $P^k$,

where $P$ is the transition matrix.
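As a sketch of how this applies to your matrix (assuming NumPy): row-normalize the weight matrix $M$ to get the stochastic transition matrix $P$, then take matrix powers of $P$ to get the $k$-step transition probabilities.

```python
import numpy as np

M = np.array([[15.0, 5.0],
              [8.0, 2.0]])

# Divide each row by its row sum so that every row sums to 1;
# P is then the transition (stochastic) matrix of the chain.
P = M / M.sum(axis=1, keepdims=True)
# P = [[0.75 0.25]
#      [0.8  0.2 ]]

# k-step transition probabilities are the k-th matrix power of P.
P2 = np.linalg.matrix_power(P, 2)  # probabilities after 2 state changes
P3 = np.linalg.matrix_power(P, 3)  # probabilities after 3 state changes
```

Note that each row of $P^k$ still sums to 1, which is a handy sanity check that the powers remain valid probability distributions.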