My notes define a Markov chain in the following way:
A Markov chain is a stochastic process $\{X_n\}_{n=0}^\infty$ with a state space $\mathcal{S}$ that is at most countable and that satisfies the Markov property:
For every $n \geq 1$ and $(x_0, \dots, x_{n+1}) \in \mathcal{S}^{n+2}$ with $\mathbb{P}(X_0 = x_0, \dots, X_n = x_n) > 0$ we have
$$\mathbb{P}(X_{n+1} = x_{n+1}\mid X_n = x_n, \dots, X_0 = x_0) = \mathbb{P}(X_{n+1} = x_{n+1}\mid X_n = x_n)$$
Moreover, the probabilities $\mathbb{P}(X_{n+1} = y \mid X_n = x)$ are stationary for all $x, y \in \mathcal{S}$ with $\mathbb{P}(X_n = x) > 0$: they do not depend on $n$ (i.e., for every $n \geq 0$ they give the same value).
Question:
My question concerns the stationarity part of the definition. Does this definition imply the following equivalence (so that all the conditional probabilities above exist)?
$$\exists n \geq 0: \mathbb{P}(X_n = x) > 0 \iff \forall n \geq 0: \mathbb{P}(X_n = x) >0$$
My PhD is in discrete Markov chains, and I've never heard this use of the terminology. What you have is a time-homogeneous Markov chain. What this means is the following.
The standard use of the term "stationarity" is very different. A distribution $\pi$ (on $\mathcal{S}$) is called a stationary measure (my preferred term is "invariant distribution"; it's also known as an "equilibrium distribution") if the distribution of $X_n$ is $\pi$ for every $n$ whenever $X_0 \sim \pi$. So if a time-homogeneous chain has transition matrix $P$, then $\pi P = \pi$. It's called the "equilibrium" distribution since, under certain (fairly weak) conditions, one can show that, regardless of the distribution of $X_0$, we have $P(X_n = j \mid X_0 = i) \to \pi(j)$ as $n \to \infty$.
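As a quick numerical illustration, here is a NumPy sketch that finds the invariant distribution of a small chain as the left eigenvector of $P$ for eigenvalue $1$. The matrix `P` is an arbitrary example of my own choosing, not anything from the question.

```python
import numpy as np

# An illustrative 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# An invariant distribution pi satisfies pi @ P = pi, i.e. pi is a
# left eigenvector of P with eigenvalue 1. Equivalently, a right
# eigenvector of P.T with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()  # normalize to a probability distribution

print(pi)                        # approximately [5/6, 1/6]
print(np.allclose(pi @ P, pi))   # True: one step leaves pi unchanged
```

For this chain you can also check the equilibrium property directly: the rows of $P^n$ converge to $\pi$ as $n$ grows.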
Note that "$\exists n : P(X_n = x) > 0$" certainly does not imply that $P(X_n = x) > 0$ for all $n$. For example, take the simple Markov chain with two states that always moves to state 1: $$ P = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}. $$ If you start at the second state, or let the distribution of $X_0$ be any measure that is not a point mass at $1$, then the claim is violated: $P(X_0 = 2) > 0$, but $P(X_n = 2) = 0$ (equivalently, $P(X_n = 1) = 1$) for all $n \ge 1$.
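You can see the violation by iterating the distribution of $X_n$ directly; this is a small sketch using the matrix above, with the uniform distribution as an (arbitrary) choice of starting measure:

```python
import numpy as np

# The chain from the example: both states jump to state 1 (index 0).
P = np.array([[1.0, 0.0],
              [1.0, 0.0]])

# Start with P(X_0 = 2) > 0, e.g. the uniform starting distribution.
mu = np.array([0.5, 0.5])

for n in range(1, 4):
    mu = mu @ P        # distribution of X_n
    print(n, mu)       # [1. 0.] from n = 1 onward: P(X_n = 2) = 0
```

So state 2 has positive probability at time $0$ but probability zero at every later time, refuting the proposed equivalence.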
An excellent book on Markov chains is the one by James Norris. It really is the go-to reference, in my opinion. Grimmett also has some good material. (Just search for their names together with "Markov chains" on your favoured search engine.)