This problem is in the book Introduction to Probability. The question goes this way.
Consider the process $\{X_n, n = 0,1,\dots\}$ with values $0$, $1$, or $2$. Suppose that $$P\left\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1},\dots,X_0 = i_0 \right\} = \begin{cases} p_{ij}^e & \text{if } n \text{ is even},\\ p_{ij}^o & \text{if } n \text{ is odd},\end{cases}$$
where $\sum_{j=0}^2 p_{ij}^e = \sum_{j=0}^2 p_{ij}^o = 1$ for $i = 0,1,2$. Why is this not a Markov chain? And how can I transform it into a Markov chain?
The book probably defines Markov chains as homogeneous Markov chains; hence, if $p_{ij}^e\ne p^o_{ij}$ for at least one pair $(i,j)$, then $(X_n)$ is, in general, not a Markov chain in this sense.
To recover a (homogeneous) Markov chain, consider $Y_n=(X_n,U_n)$ where $U_n=o$ if $n$ is odd and $U_n=e$ if $n$ is even. Then the process $(Y_n)$ is a (homogeneous) Markov chain on the state space $\{0,1,2\}\times\{o,e\}$ with transition probabilities $p_{ij}^o$ from $(i,o)$ to $(j,e)$, $p_{ij}^e$ from $(i,e)$ to $(j,o)$, and $0$ from $(i,o)$ to $(j,o)$ and from $(i,e)$ to $(j,e)$, for every $i$ and $j$ in $\{0,1,2\}$.
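To make the construction concrete, here is a small sketch in Python/NumPy. The matrices `P_e` and `P_o` below are hypothetical (any row-stochastic $3\times 3$ matrices would do); the point is only to show how the single $6\times 6$ transition matrix of the lifted chain $(Y_n)$ is assembled:

```python
import numpy as np

# Hypothetical transition probabilities (not from the book; any
# row-stochastic matrices work): P_e[i, j] = p_{ij}^e, P_o[i, j] = p_{ij}^o.
P_e = np.array([[0.5, 0.3, 0.2],
                [0.1, 0.6, 0.3],
                [0.4, 0.4, 0.2]])
P_o = np.array([[0.2, 0.5, 0.3],
                [0.3, 0.3, 0.4],
                [0.6, 0.1, 0.3]])

# Lifted chain Y_n = (X_n, U_n) on {0,1,2} x {e,o}.
# Order the six states as (0,e),(1,e),(2,e),(0,o),(1,o),(2,o).
Q = np.zeros((6, 6))
Q[:3, 3:] = P_e   # from (i,e) the chain moves to (j,o) with prob p_{ij}^e
Q[3:, :3] = P_o   # from (i,o) the chain moves to (j,e) with prob p_{ij}^o
# The two diagonal 3x3 blocks stay zero: (i,e) -> (j,e) and
# (i,o) -> (j,o) are impossible, since the parity flips at every step.

# Q is a single time-homogeneous transition matrix: each row sums to 1.
assert np.allclose(Q.sum(axis=1), 1.0)
```

The even/odd information that broke homogeneity is now stored inside the state itself, so one fixed matrix `Q` governs every step.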