Source: This comes from "Introduction to Probability" by Charles Miller Grinstead and James Laurie Snell. It is Theorem 11.1 in Section 11.1 (Introduction), on page 407.
Theorem: Theorem 11.1 states: Let $P$ be the transition matrix of a Markov chain. The $ij$th entry $p^{(n)}_{ij}$ of the matrix $P^n$ gives the probability that the Markov chain, starting in state $s_i$, will be in state $s_j$ after $n$ steps.
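To make the statement concrete before the proof, here is a small numerical sketch. The two-state chain and its transition probabilities below are made up for illustration; the point is that the entry $(P^2)_{12}$ computed by matrix multiplication is exactly the two-step probability $p_{11}p_{12} + p_{12}p_{22}$.

```python
# A hypothetical two-state chain (state 0 and state 1); the transition
# probabilities are invented for this example, not from the book.
P = [[0.5, 0.5],
     [0.25, 0.75]]

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][h] * B[h][j] for h in range(n)) for j in range(n)]
            for i in range(n)]

def matrix_power(P, n):
    """Compute P^n by repeated multiplication (n >= 1)."""
    result = P
    for _ in range(n - 1):
        result = matmul(result, P)
    return result

P2 = matrix_power(P, 2)
# (P^2)_{01} sums over the intermediate state h:
#   P[0][0]*P[0][1] + P[0][1]*P[1][1] = 0.5*0.5 + 0.5*0.75 = 0.625
print(P2[0][1])  # 0.625
```

Each row of $P^2$ still sums to 1, as it must for a transition matrix, which is a quick sanity check on the computation.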
Proof: The argument is by induction on $n$; the case $n = 1$ is just the definition of $P$. For the inductive step, start with the probability $\Pr(X_{k+n} = s_j \mid X_k = s_i)$. Conditioning on the state at time $k+n-1$ and using the Markov property, this can be written as $\sum_{h=1}^{r} \Pr(X_{k+n} = s_j \mid X_{k+n-1} = s_h)\,\Pr(X_{k+n-1} = s_h \mid X_k = s_i)$, where $r$ is the number of states. The first factor is $p_{hj}$, and by the induction hypothesis the second factor is $p^{(n-1)}_{ih}$, so the sum becomes $\sum_{h=1}^{r} p^{(n-1)}_{ih}\, p_{hj}$. But this is exactly the $ij$th entry of the matrix product $P^{n-1}P = P^n$, i.e. $p^{(n)}_{ij}$, which completes the induction.
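The proof's conclusion can also be checked numerically: estimate $\Pr(X_n = s_j \mid X_0 = s_i)$ by simulating the chain many times, and compare with the $(i,j)$ entry of $P^n$ computed via the recursion in the proof. The three-state chain below is a made-up example, not from the book.

```python
import random

# Invented 3-state transition matrix for illustration; each row sums to 1.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def step(state):
    """Take one step of the chain from `state`, sampling from row `state` of P."""
    u = random.random()
    cum = 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1  # guard against floating-point round-off

def n_step_entry(P, n, i, j):
    """Compute (P^n)_{ij} using the proof's recursion:
    (P^n)_{ij} = sum_h (P^{n-1})_{ih} * P_{hj}."""
    row = P[i][:]                      # row i of P^1
    for _ in range(n - 1):
        row = [sum(row[h] * P[h][col] for h in range(len(P)))
               for col in range(len(P))]
    return row[j]

random.seed(0)
n, i, j, trials = 4, 0, 2, 200_000
hits = 0
for _ in range(trials):
    state = i
    for _ in range(n):
        state = step(state)
    hits += (state == j)

# The empirical frequency and the matrix-power entry should agree closely.
print(hits / trials, n_step_entry(P, n, i, j))
```

With 200,000 trials the sampling error is on the order of $\sqrt{p(1-p)/N} \approx 0.001$, so the two numbers should match to about two decimal places.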
Class: This is for a class called Seminar, which we're supposed to take the year we graduate. In it, we write a research essay that we then present to the class, and we do a set of problems every week on a new topic (e.g., functions one week, inequalities the next, then derivatives, and so on). It's supposed to be a capstone class for our degree that goes over a little bit of everything from the previous three years, but makes you think more and put things together. So we're mathematically mature, but we're not graduate level or anything, if that helps.
What I Need: I could use some help fleshing out this proof in a way that would be understandable to any college mathematics student. I'm doing it for a class and need to be able to teach everyone in my class to understand this proof, but statistics isn't my strong suit, so any help would be appreciated.