Definition of Markov Chain (as it is stated in my textbook):
Let $S$ be a set of states and $\mathbb P = \{p_{i,j}\}$, $i,j \in S$, a transition matrix. Then the sequence of random variables $(X_{n})_{n \ge 0}$ is a Markov chain if $P(X_{n+1} = i_{n+1} \mid X_{n} = i_{n}, \ldots, X_0 = i_0) = p_{i_n, i_{n+1}}$ for all $n \ge 0$ and all $i_0, \ldots, i_{n+1} \in S$.
Question:
Should $P(X_{n+1} = i_{n+1} \mid X_{n} = i_{n}, \ldots, X_0 = i_0) = p_{i_n, i_{n+1}}$ also hold if $i_0, \ldots, i_{n+1} \in S$ are chosen such that $P(X_{n} = i_{n}, \ldots, X_0 = i_0) = 0$, which means that $$P(X_{n+1} = i_{n+1} \mid X_{n} = i_{n}, \ldots, X_0 = i_0) = \frac {P(X_{n+1} = i_{n+1},X_{n} = i_{n}, \ldots, X_0 = i_0)} {P(X_{n} = i_{n}, \ldots, X_0 = i_0)}$$ is undefined? Is it implicit in the definition that such cases should be ignored, since they cannot occur?
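To make the corner case concrete, here is a small hypothetical example (my own construction, not from the textbook): a two-state chain with the identity transition matrix, started deterministically in state $0$. The history $(X_0 = 0, X_1 = 1)$ then has probability zero, so the conditional probability above is $0/0$:

```python
import numpy as np

# Hypothetical 2-state chain with identity transition matrix:
# the chain never leaves its starting state.
P = np.array([[1.0, 0.0],
              [0.0, 1.0]])
mu = np.array([1.0, 0.0])  # initial distribution: start in state 0 a.s.

# Joint probability of the history (X_0 = 0, X_1 = 1):
# P(X_0 = 0) * p_{0,1} = 1 * 0 = 0, so conditioning on this
# history is undefined -- exactly the case in the question.
path_prob = mu[0] * P[0, 1]
print(path_prob)  # 0.0
```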
Also, is it implicit in the definition that $S$ is a finite or countably infinite set of real numbers, or could the set be arbitrary?