Let $X, X_{n,k}$ for $k,n\in\mathbb{N}$ denote independent random variables with values in $\mathbb{N}_0$. Define $N_0:=1$ and for $n\in\mathbb{N}$ set $$ N_n:=\begin{cases}0, & \text{ if }N_{n-1}=0\\X_{n,1}+\ldots+X_{n,N_{n-1}}, & \text{ if }N_{n-1}>0.\end{cases} $$ Prove or disprove that $(N_{n})_{n\in\mathbb{N}_0}$ is a Markov chain.
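For intuition, the recursion can be simulated. Below is a minimal sketch assuming, for simplicity, that the $X_{n,k}$ are i.i.d. with a finite offspring distribution passed as a list to sample from uniformly (the statement itself only requires independence; the function name and parameters are my own):

```python
import random

def simulate_chain(offspring_values, steps, seed=None):
    """Simulate (N_n): N_0 = 1, and given N_{n-1} = m > 0,
    N_n = X_{n,1} + ... + X_{n,m} with independent offspring counts
    drawn uniformly from offspring_values; 0 is absorbing."""
    rng = random.Random(seed)
    n = 1  # N_0 := 1
    path = [n]
    for _ in range(steps):
        n = sum(rng.choice(offspring_values) for _ in range(n)) if n > 0 else 0
        path.append(n)
    return path
```

Running this with, say, `offspring_values=[0, 2]` shows the two behaviours relevant to the cases below: either the chain stays positive, or it hits $0$ and then stays at $0$ forever.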
I have to check whether the Markov property is fulfilled, i.e. whether $$ P(N_{n+1}=a_{n+1}|N_n=a_n,\ldots,N_0=1)=P(N_{n+1}=a_{n+1}|N_n=a_n).~~~~~~~(*) $$
I would distinguish two cases, namely (1.) $a_{n+1}>0$ and (2.) $a_{n+1}=0$.
(1.) If $a_{n+1}>0$, this is only possible if $N_{n+1}=X_{n+1,1}+\cdots+X_{n+1,N_n}$ (and this sum is $>0$), which forces $N_n>0$. Repeating the argument gives $N_{n-1}>0$, and so on down to $N_0$.
So on the conditioning event on the LHS of $(*)$, every $N_k$ for $k\in\left\{1,\ldots,n\right\}$ is of the form $N_k=X_{k,1}+\cdots+X_{k,N_{k-1}}$. Hence $(*)$ follows from the independence of the family $(X_{n,k})_{n,k}$.
(2.) If $a_{n+1}=0$, then either $a_n=0$ or $a_n>0$. If $a_n>0$, then (as in case 1) every $N_k$ for $k\in\left\{1,\ldots,n\right\}$ is of the form $N_k=X_{k,1}+\ldots+X_{k,N_{k-1}}$. Given $N_n=a_n$, the event $\{N_{n+1}=0\}$ depends only on the variables $X_{n+1,1},\ldots,X_{n+1,a_n}$, so $(*)$ again follows by independence.
But I do not know how to handle the case $a_{n+1}=0$ and $a_n=0$.