I just tried to prove that the difference $Z(t) = A(t) - B(t)$ of two independent Poisson processes $A$ and $B$ is still Markov.
First of all, I prove that if $N(t)$ is Markov, then $N^*(t) = -N(t)$ is Markov.
\begin{align*} &\mathbb{P}(N^*(t) = j \vert N^*(t_n) = i, ..., N^*(t_0) = i_0) \\ =&\ \mathbb{P}(-N(t) = j \vert -N(t_n) = i, ..., -N(t_0) = i_0) \\ =&\ \mathbb{P}(N(t) = -j \vert N(t_n) = -i, ..., N(t_0) = -i_0) \\ =&\ \mathbb{P}(N(t) = -j \vert N(t_n) = -i) = \mathbb{P}(N^*(t) = j \vert N^*(t_n) = i) \end{align*}
Now I prove that if $A(t)$ and $B(t)$ are two independent Markov processes with independent increments, then their sum is Markov. Since $-B$ is Markov by the first step, this applies to $Z = A + (-B)$; below I simply write $B$ in place of $-B$. \begin{align*} &\mathbb{P}(Z(t+1) = i_{t+1} \vert Z(t) = i_t, ..., Z(0) = i_0) \\ =&\ \mathbb{P}(A(t+1) + B(t+1) = i_{t+1} \vert A(t) + B(t) = i_t, ..., A(0) + B(0) = i_0) \\ =&\ \sum_{j+k=i_{t+1}}\mathbb{P}(A(t+1) = j, B(t+1) = k \vert A(t) + B(t) = i_t, ..., A(0)+B(0) = i_0) \\ =&\ \sum_{j+k=i_{t+1}}\mathbb{P}(A(t+1) = j \vert A(t) + B(t) = i_t, ..., A(0)+B(0) = i_0)\\ &\times \mathbb{P}(B(t+1) = k \vert A(t) + B(t) = i_t, ..., A(0)+B(0) = i_0) \\ =&\ \sum_{j+k=i_{t+1}}\mathbb{P}(A(t+1) = j \vert A(t) + B(t) = i_t) \times \mathbb{P}(B(t+1) = k \vert A(t) + B(t) = i_t) \\ =&\ \sum_{j+k=i_{t+1}}\mathbb{P}(A(t+1) = j, B(t+1) = k \vert A(t) + B(t) = i_t) \\ =&\ \mathbb{P}(A(t+1) + B(t+1) = i_{t+1} \vert A(t) + B(t) = i_t) = \mathbb{P}(Z(t+1) = i_{t+1} \vert Z(t) = i_t) \end{align*}
Should I be more detailed? Is the proof correct?
The proof is essentially correct, apart from some small errors and a few points that need more explanation. Since I don't know which steps you are unsure about, I will go through why each step is correct.
First you started by proving that if $N$ is a Markov process, then so is $-N$. For this, you picked $t > t_n > ... > t_0$. \begin{align*} &\mathbb{P}(N^*(t) = j \vert N^*(t_n) = i, ..., N^*(t_0) = i_0) \tag{Start point to show Markovian} \\ =& \mathbb{P}(-N(t) = j \vert -N(t_n) = i, ..., -N(t_0) = i_0) \tag{$N^* = -N$} \\ =& \mathbb{P}(N(t) = -j \vert N(t_n) = -i, ..., N(t_0) = -i_0) \tag{Same event} \\ =& \mathbb{P}(N(t) = -j \vert N(t_n) = -i) \tag{$N$ is Markov}\\ =& \mathbb{P}(N^*(t) = j \vert N^*(t_n) = i) \tag{$N^* = -N$} \end{align*}
Now you proceed to show that $Z$ is Markov. However, in your proof you took $t+1, t$, etc. as time points. You should instead work with arbitrary time points $t_{n+1} > t_n > ... > t_0$, since the Markov property has to hold for all of them. \begin{align*} & \mathbb{P}(Z(t_{n+1}) = i_{n+1} \vert Z(t_n) = i_n, ..., Z(t_0) = i_0) \tag{Start point} \\ = &\mathbb{P}(A(t_{n+1}) + B(t_{n+1}) = i_{n+1} \vert A(t_n) + B(t_n) = i_n, ..., A(t_0) + B(t_0) = i_0) \tag{$Z = A+B$} \\ =& \sum_{j+k=i_{n+1}}\mathbb{P}(A(t_{n+1}) = j, B(t_{n+1}) = k \vert A(t_n) + B(t_n) = i_n, ..., A(t_0)+B(t_0) = i_0) \tag{Split into disjoint events} \\=&\sum_{j+k=i_{n+1}}\mathbb{P}(A(t_{n+1}) = j\vert A(t_n) + B(t_n) = i_n, ..., A(t_0)+B(t_0) = i_0)\\& \times \mathbb{P}(B(t_{n+1}) = k \vert A(t_n) + B(t_n) = i_n, ..., A(t_0)+B(t_0) = i_0) \tag{1}\\=& \sum_{j+k=i_{n+1}}\mathbb{P}(A(t_{n+1}) = j\vert A(t_n) + B(t_n) = i_n) \times\mathbb{P}(B(t_{n+1}) = k \vert A(t_n) + B(t_n) = i_n) \tag{2}\\ =& \sum_{j+k=i_{n+1}}\mathbb{P}(A(t_{n+1})= j, B(t_{n+1}) = k \vert A(t_n) + B(t_n) = i_n) \tag{1} \\=& \mathbb{P}(A(t_{n+1}) + B(t_{n+1}) = i_{n+1} \vert A(t_n) + B(t_n) = i_n) \tag{Recombine}\\=& \mathbb{P}(Z(t_{n+1}) = i_{n+1} \vert Z(t_n) = i_n) \tag{$Z = A+B$} \end{align*}
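As a numerical sanity check (separate from the proof), one can compare the empirical law of $Z(t) = A(t) - B(t)$ at a fixed time with the Skellam distribution, which is the known law of the difference of two independent Poisson random variables. The rates, horizon, and sample size below are arbitrary choices for illustration; a minimal sketch in Python:

```python
import math

import numpy as np

rng = np.random.default_rng(0)

# Arbitrary rates and horizon, chosen only for illustration.
lam_a, lam_b, t = 2.0, 3.0, 1.0
n = 200_000

# At a fixed time t, A(t) ~ Poisson(lam_a * t) and B(t) ~ Poisson(lam_b * t).
a = rng.poisson(lam_a * t, size=n)
b = rng.poisson(lam_b * t, size=n)
z = a - b  # samples of Z(t) = A(t) - B(t)

def skellam_pmf(k, mu1, mu2, terms=200):
    """P(X - Y = k) for independent X ~ Poisson(mu1), Y ~ Poisson(mu2),
    by truncating the convolution sum over X = m, Y = m - k."""
    total = 0.0
    for m in range(max(k, 0), max(k, 0) + terms):
        total += math.exp(-mu1 - mu2
                          + m * math.log(mu1) - math.lgamma(m + 1)
                          + (m - k) * math.log(mu2) - math.lgamma(m - k + 1))
    return total

# Empirical frequency of {Z(t) = 0} vs. the exact Skellam probability.
freq0 = float(np.mean(z == 0))
exact0 = skellam_pmf(0, lam_a * t, lam_b * t)
```

The sample mean of `z` should also be close to $(\lambda_A - \lambda_B)\,t$, since $\mathbb{E}[Z(t)] = \lambda_A t - \lambda_B t$.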
$(1)$: Note that $A(t_{n+1})$ and $B(t_{n+1})$ are independent, and they remain conditionally independent given the variables being conditioned on here. That justifies the two steps of the proof tagged $(1)$.
$(2)$: This requires some explanation. The point is that $A(t_{n+1})$ and $\{A(t_{i}) : i \leq n-1\}$ are conditionally independent given $A(t_n)$. Moreover, $A(t_{n+1})$ is independent of the $B(t_i)$ anyway, and therefore conditionally independent of them given $A(t_n)$. It follows that $A(t_{n+1})$ is conditionally independent of $\{A(t_i), B(t_i) : i \leq n-1\}$ given $A(t_n)$. Using the conditional version of the disjoint-blocks theorem (easy to prove, just like the usual one), we get that $A(t_{n+1})$ is conditionally independent of $\{A(t_i) + B(t_i) : i \leq n-1\}$ given $A(t_n)$. This justifies step $(2)$, since (under existence conditions) for any events $A, B, C$ we have $\mathbb{P}(A \vert B, C) = \mathbb{P}(A, C \vert B) / \mathbb{P}(C \vert B)$, which equals $\mathbb{P}(A \vert B)$ once the numerator factors by conditional independence (you can read off which $A, B, C$ to use from context).
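The identity $\mathbb{P}(A \vert B, C) = \mathbb{P}(A, C \vert B)/\mathbb{P}(C \vert B)$, and its collapse to $\mathbb{P}(A \vert B)$ under conditional independence, can be checked on a toy distribution. A small sketch with made-up numbers, in which $X$ and $Y$ are conditionally independent given $W$:

```python
from itertools import product

# Made-up toy model: W ~ Bernoulli(0.4); given W = w, X and Y are
# independent Bernoulli(p[w]) and Bernoulli(q[w]) respectively.
pw = {0: 0.6, 1: 0.4}
p = {0: 0.2, 1: 0.7}
q = {0: 0.5, 1: 0.9}

def joint(w, x, y):
    """P(W = w, X = x, Y = y), built so X and Y are cond. independent given W."""
    px = p[w] if x == 1 else 1 - p[w]
    py = q[w] if y == 1 else 1 - q[w]
    return pw[w] * px * py

def p_x1_given_w_y(w, y):
    """P(X = 1 | W = w, Y = y) = P(X = 1, Y = y | W = w) / P(Y = y | W = w)."""
    return joint(w, 1, y) / sum(joint(w, x, y) for x in (0, 1))

def p_x1_given_w(w):
    """P(X = 1 | W = w)."""
    num = sum(joint(w, 1, y) for y in (0, 1))
    den = sum(joint(w, x, y) for x in (0, 1) for y in (0, 1))
    return num / den

# Conditioning on Y in addition to W changes nothing, as in step (2).
gaps = [abs(p_x1_given_w_y(w, y) - p_x1_given_w(w))
        for w, y in product((0, 1), (0, 1))]
```

Here $W$ plays the role of $B$, the extra variable $Y$ plays the role of $C$, and the event $\{X = 1\}$ plays the role of $A$.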
Check whether you used all the properties of the Poisson process above. (You have not.) If not, then you have actually proved a more general statement than the one you initially set out to prove. You can try to write that down.
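Written out, one possible phrasing of that more general statement (a sketch, using only the hypotheses that actually appear in the computation above) is:

```latex
\textbf{Proposition.} Let $A$ and $B$ be independent processes with
independent increments, taking values in a countable set. Then $Z = A + B$
is a Markov process. In particular, since $-B$ is Markov whenever $B$ is
(the first part of the argument), the difference $A - B$ of two independent
Poisson processes is Markov.
```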
Also, in more general settings the sum in the calculation becomes an integral, but that can be handled in a similar fashion, I assure you.
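For example, for real-valued processes admitting conditional densities (an assumption), the split over $j + k = i_{n+1}$ becomes a convolution integral, schematically:

```latex
f_{Z(t_{n+1})}(z \mid \text{past})
  = \int_{\mathbb{R}} f_{A(t_{n+1})}(x \mid \text{past})\,
    f_{B(t_{n+1})}(z - x \mid \text{past})\, \mathrm{d}x,
```

where "past" stands for the same conditioning on $Z(t_n), \dots, Z(t_0)$ as in the discrete case, and the factorization of the integrand uses the same conditional independence as before.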