Let $(X_n)_{n\ge 0}$ be a simple asymmetric random walk on the states $0,1,\dots,M$, where $0$ and $M$ are absorbing. The initial state is $i\notin\{0,M\}$. Let $(X_n^*)_{n\ge 0}$ be the process $(X_n)$ conditioned on the event $A:=\{X_n=M\text{ eventually}\}$. That is, $$P((X_0^*,X_1^*,\dots,X_k^*)\in B)=P((X_0,X_1,\dots,X_k)\in B\mid A)=\frac{P(\{(X_0,X_1,\dots,X_k)\in B\}\cap A)}{P(A)}$$ for any $k\ge 0$ and $B\subset \{0,1,\dots,M\}^{k+1}$.
I need to show that $(X^*_n)$ is a Markov chain.
Intuitively it's clear why this holds: on the event $A$ none of the $X_k$ can equal $0$, and the Markov property should then be inherited from $(X_n)$, but I'm struggling to show this rigorously.
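Not a proof, but a numerical sanity check that supports the intuition: if $(X_n^*)$ is Markov, its one-step up-probability from an interior state $x$ should depend only on $x$, and (by conditioning each step on $A$ via the Markov property of $(X_n)$) should equal $p\,h(x+1)/h(x)$, where $h(x)=P_x(A)$ is the classical gambler's-ruin probability. Below is a minimal simulation sketch; the parameter values $p=0.6$, $M=5$, $i=2$ are arbitrary choices, and `h`, `simulate` are names I made up for this illustration.

```python
import random

def h(x, p, M):
    # Gambler's-ruin probability: P_x(hit M before 0) for p != 1/2.
    r = (1 - p) / p
    return (1 - r**x) / (1 - r**M)

def simulate(i, p, M, n_paths, seed=0):
    rng = random.Random(seed)
    up = [0] * M      # up[x]: times the conditioned walk stepped x -> x+1
    tot = [0] * M     # tot[x]: times the conditioned walk left state x
    for _ in range(n_paths):
        path, x = [i], i
        while 0 < x < M:
            x += 1 if rng.random() < p else -1
            path.append(x)
        if x == M:  # keep only paths in A = {absorbed at M}
            for a, b in zip(path, path[1:]):
                tot[a] += 1
                up[a] += (b == a + 1)
    return up, tot

p, M, i = 0.6, 5, 2
up, tot = simulate(i, p, M, 200_000)
for x in range(1, M):
    emp = up[x] / tot[x]
    theo = p * h(x + 1, p, M) / h(x, p, M)
    print(f"x={x}: empirical {emp:.4f}, conjectured p*h(x+1)/h(x) = {theo:.4f}")
```

Note that from $x=1$ the conditioned walk must step up with probability $1$ (a down-step would land in $0$, contradicting $A$), and indeed $p\,h(2)/h(1)=1$ there; the simulation reproduces this.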