The well-known Markov property states that $$P(X_n = i \mid X_{n-1} = k_1, X_{n-2} = k_2, \dots, X_{n-j} = k_j ) = P(X_n = i \mid X_{n-1} = k_1) $$
Suppose we lay out some stochastic model in the following transition probability graph

Given that there is a repeated node, is it still possible to consider the process a discrete-time Markov chain?
Whenever you write down a definition of your stochastic process as such a graph, you force it to be a Markov chain (MC), at least if the graph is given its usual interpretation. But in that case every node counts as a separate state, so the "1" below would be a state different from the top-left "1".

Such objects are called labeled Markov chains (LMCs): whereas an MC is given by a pair $(X,P)$ where $X$ is the state space and $P$ is a transition matrix, an LMC is a tuple $(X,P,Y,L)$ where $Y$ is a set of labels and $L:X\to Y$ is a labeling map. For example, in your case $X = \{a,a',b,c\}$, $P$ is given by your graph, $Y = \{1,2,3\}$ and $L:(a,a',b,c)\mapsto(1,1,2,3)$.

The process $X$ is indeed an MC (by definition), that is, it satisfies the Markov property. It induces a stochastic process on $Y$, which in this case is not a Markov process: given that $Y = 1$, you get different transition probabilities depending on whether you know the whole history or just the current label.
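To make the last point concrete, here is a small sketch. Since the original transition graph is not reproduced here, the matrix below is an *assumed* example with the same labeling structure: two distinct states $a$ and $a'$ both carry the label $1$, and knowing the previous label tells you which of the two you are in.

```python
import numpy as np

# Hypothetical transition matrix over states (a, a', b, c); the actual
# probabilities from the question's graph are not known, so these are
# assumed purely for illustration.
states = ["a", "a'", "b", "c"]
label = {"a": 1, "a'": 1, "b": 2, "c": 3}  # the labeling map L
P = np.array([
    [0.0, 0.0, 1.0, 0.0],  # a  -> b  (label 2)
    [0.0, 0.0, 0.0, 1.0],  # a' -> c  (label 3)
    [1.0, 0.0, 0.0, 0.0],  # b  -> a  (label 1)
    [0.0, 1.0, 0.0, 0.0],  # c  -> a' (label 1)
])

# Observing labels "... 2, 1" means the chain is in state a;
# observing "... 3, 1" means it is in state a'.  So the probability of
# seeing label 2 next, given the current label is 1, depends on history:
p_next2_after_21 = P[states.index("a"),  states.index("b")]
p_next2_after_31 = P[states.index("a'"), states.index("b")]
print(p_next2_after_21, p_next2_after_31)  # 1.0 vs 0.0
```

The two conditional probabilities differ, so the label process on $Y$ fails the Markov property even though the underlying state process $X$ satisfies it.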