I would like to get some clarification on the terminology used in descriptions of Markov chains. I've seen some resources say that "a Markov chain consists of a sequence of random variables" and other sources state that it "consists of a sequence of states".
Are "states" and "random variables" used interchangeably? If not, what exactly is a state and what is a random variable here?
The two terms are closely related but not interchangeable: a Markov chain is a sequence of random variables $X_0,X_1,X_2,\ldots$, and the possible *values* each $X_n$ can take are the states; informally, $X_n$ itself is often called "the state at time $n$". For example, suppose a drunk man is trying to find his way off a football field and he can't tell his left from his right. Call his position on the field at time $n\in\mathbb{N}$ his state $X_n$, and say he starts at the 50 yard line (i.e. $X_0=50$). His position can take any value from $0$ to $100$, so the state space is $S=\{0,1,\ldots,100\}$, and suppose he moves one yard left or right at each step with probability $1/2$ each. Then his position at time step $n=1$ is the random variable $X_1$, which takes the value $49$ with probability $1/2$ and the value $51$ with probability $1/2$. The Markov chain is the sequence of these "states" $\{X_0,X_1,\ldots,X_n,\ldots\}$, where each $X_n$ is a random variable taking some value (state) between $0$ and $100$. I hope this clarifies some of the terminology.
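To make the distinction concrete, here is a minimal simulation sketch of the drunk man's walk. Each call draws one realization of the sequence $(X_0, X_1, \ldots)$; the random variable $X_n$ is what the `path[n]` entry *would* be across all possible runs, while any single `path[n]` is a state. The behavior at the field's edges ($0$ and $100$) isn't specified above, so this sketch simply stops the walk there; that choice is my assumption, not part of the example.

```python
import random

def simulate_walk(start=50, steps=10, seed=None):
    """Draw one realization of the drunk man's walk on the field.

    path[n] is one observed value (a state) of the random variable X_n.
    Boundary behavior at 0 and 100 is an assumption: the walk stops
    once he reaches either end zone.
    """
    rng = random.Random(seed)
    path = [start]                            # X_0 = start
    for _ in range(steps):
        x = path[-1]
        if x in (0, 100):                     # reached an edge of the field
            break
        path.append(x + rng.choice([-1, 1]))  # left or right, prob 1/2 each
    return path

path = simulate_walk(start=50, steps=10, seed=42)
print(path)  # e.g. one possible realization: [50, 49, 48, ...]
```

Running this many times with different seeds shows why $X_1$ is a random variable: across runs, `path[1]` is $49$ about half the time and $51$ the other half, while within any single run it is just one fixed state.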