Q: I suspect this is true: if two states in a Markov chain communicate and one is recurrent, then the other is recurrent.
My approach is: say i and j are two states that communicate. Since i is recurrent, you are guaranteed to revisit i every time you reach it, so there are infinitely many (discrete) times at which you are at i. Because i communicates with j, the chain must also keep passing through j, so j is recurrent too, as we are always guaranteed to revisit j after reaching it.
My problem with this proof is that it involves a lot of words, so an alternative proof would be great. Thanks.
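The wordy sketch above can be compressed into the standard symbolic argument, using $p^{(n)}_{xy}$ for the $n$-step transition probabilities and the characterization that a state $x$ is recurrent iff $\sum_{k} p^{(k)}_{xx} = \infty$. Since $i \leftrightarrow j$, there exist $m, n$ with $p^{(m)}_{ij} > 0$ and $p^{(n)}_{ji} > 0$. One route from $j$ back to $j$ in $n + k + m$ steps goes $j \to i$ in $n$ steps, $i \to i$ in $k$ steps, $i \to j$ in $m$ steps, so by Chapman-Kolmogorov:

$$p^{(n+k+m)}_{jj} \;\ge\; p^{(n)}_{ji}\, p^{(k)}_{ii}\, p^{(m)}_{ij}.$$

Summing over $k$ and using the recurrence of $i$:

$$\sum_{k} p^{(n+k+m)}_{jj} \;\ge\; p^{(n)}_{ji}\, p^{(m)}_{ij} \sum_{k} p^{(k)}_{ii} \;=\; \infty,$$

so $\sum_{k} p^{(k)}_{jj} = \infty$ and $j$ is recurrent.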
You are describing what's called a regular Markov chain: from every state you can eventually reach every other state, and no probability mass leaves the chain or enters from outside. This means that the transition matrix (there are different names for such a matrix; here in the column convention) has columns that add up to 1. The key fact is that the dominant eigenvalue of such a matrix is 1. That ought to be the first part of your proof. Then any particular distribution vector can be expressed as a linear combination of the eigenvectors of the matrix. In the long run (i.e. taking high powers of the matrix), linear algebra shows that the dominant eigenvalue prevails as the others "die out". Thus, the entries of the corresponding eigenvector give the ratios of the equilibrium distribution. This is the proof "in words".
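A small numerical sketch of that eigenvalue claim, using a made-up 3-state column-stochastic matrix (the specific entries are just an example, not from the post):

```python
import numpy as np

# Hypothetical regular chain; columns sum to 1 (column-stochastic convention,
# matching the answer's description of the transition matrix).
P = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# The dominant eigenvalue of such a matrix is 1.
eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
assert np.isclose(eigvals[k].real, 1.0)

# The eigenvector for eigenvalue 1, rescaled to sum to 1, is the
# equilibrium distribution.
pi = eigvecs[:, k].real
pi = pi / pi.sum()

# High powers of P: the sub-dominant eigenvalues "die out", so every
# column of P^50 converges to pi.
P_high = np.linalg.matrix_power(P, 50)
print(np.allclose(P_high, np.column_stack([pi, pi, pi])))
```

Running this prints `True`: after 50 steps, every starting distribution has been pulled onto the equilibrium eigenvector, which is exactly the "dominant eigenvalue prevails" argument above.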