The Poisson process $\left(N_t\right)_{t\in\left[0,\infty\right)}$ is supposed to be a Markov process, but a Markov process $\left(X_t\right)_{t\in I}$ should come equipped with a family of distributions $\left(\mathrm{P}_e\right)_{e\in E}$ (where $E$ is the process's state space) satisfying $$\forall e\in E,\quad \mathrm{P}_e\left(X_0=e\right)=1.$$ However, the Poisson process satisfies $$N_0=0$$ by definition. These two requirements don't add up, so I'd like to revise my textbook's definition of a Poisson process to make it a proper Markov process.
According to my textbook (Section 5.5, "The Poisson Process"), with $$\mathcal{I} := \left\{(a,b]:\space a,b\in[0,\infty), a\leq b\right\}$$ and with $\left(N_I,\space I\in\mathcal{I}\right)$ being a family of random variables with values in $\mathbb{N}_0$, $N_I$ to be interpreted as the number of occurrences during the time interval $I$, $\left(N_I,\space I\in\mathcal{I}\right)$ is a Poisson process iff it satisfies the following five axioms:
(A1) $N_{I\cup J} = N_I + N_J$ if $I\cap J = \emptyset$ and $I\cup J\in\mathcal{I}$.
(A2) The distribution of $N_I$ depends only on the length of $I$: $P_{N_I} = P_{N_J}$ for all $I,J\in\mathcal{I}$ with $\ell\left(I\right) = \ell\left(J\right)$.
(A3) If $\mathcal{J}\subseteq\mathcal{I}$ with $I\cap J=\emptyset$ for all $I,J\in\mathcal{J}$ with $I\neq J$, then $\left(N_J,\space J\in\mathcal{J}\right)$ is an independent family.
(A4) For any $I\in\mathcal{I}$, we have $\mathrm{E}\left[N_I\right]<\infty$.
(A5) $\lim_{\varepsilon\downarrow0}\varepsilon^{-1}\,\mathrm{P}\left[N_{(0,\varepsilon]}\geq2\right] = 0$.
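(For context, and not part of the textbook's statement: these axioms are known to imply that each count is actually Poisson distributed, which is where the name comes from. Writing $\lambda := \mathrm{E}\left[N_{(0,1]}\right]$, one has $$\mathrm{P}\left[N_I=k\right]=e^{-\lambda\,\ell\left(I\right)}\,\frac{\left(\lambda\,\ell\left(I\right)\right)^k}{k!},\qquad k\in\mathbb{N}_0,\ I\in\mathcal{I}.$$)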
How can I revise these axioms to suit my purpose?
To expand on my comment, it does add up and there is nothing to revise.
Let us recall that the process $(N_I)_{I\in\mathcal I}$ is indexed by intervals, not real numbers, hence the notation $(N_t)_{t\geqslant0}$ is of your own making, in the context of the definitions you recall, and in particular $N_0$ simply does not exist. To recover a Markov process $(X_t)_{t\geqslant0}$, one should ask that, for every $t\gt0$, $$ X_t=X_0+N_{(0,t]}, $$ and it would probably be healthier to add the hypothesis that $X_0$ and $(N_I)_{I\in\mathcal I}$ are independent.
With this in mind, it is true that one often chooses $X_0=0$, in which case $X_t=N_{(0,t]}$ for every $t\geqslant0$, the case $t=0$ included.
To sum up, the process $(N_I)_{I\in\mathcal I}$ enumerates events which occur in intervals $I$ of $(0,+\infty)$, hence it describes only the increments of a Markov process $(X_t)_{t\geqslant0}$. Either one considers that no event happens at time $0$, in which case the number of events up to time $t$ is $X_t=N_{(0,t]}$ for every $t\geqslant0$; or $X_0$ events occur at time $0$, in which case the number of events up to time $t$ is $X_t=X_0+N_{(0,t]}$ for every $t\geqslant0$.
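The construction above can be sketched numerically. The following is a minimal simulation (the rate $\lambda$, and the helper names `poisson_arrivals` and `N`, are my own, not from the question's textbook): it builds arrival times from i.i.d. exponential interarrival gaps, reads off the interval counts $N_{(a,b]}$, and checks additivity (A1), the start value $X_0=x_0$, and that $\mathrm{E}\left[N_{(0,t]}\right]\approx\lambda t$.

```python
import random

def poisson_arrivals(rate, horizon, rng):
    """Arrival times of a rate-`rate` Poisson process on (0, horizon],
    built from i.i.d. exponential interarrival gaps."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

def N(arrivals, a, b):
    """Interval count N_{(a,b]}: number of arrivals in (a, b]."""
    return sum(1 for s in arrivals if a < s <= b)

rng = random.Random(0)
rate, horizon = 2.0, 10.0

# One path: additivity (A1) holds pathwise for disjoint adjacent intervals.
arr = poisson_arrivals(rate, horizon, rng)
assert N(arr, 0.0, 4.0) + N(arr, 4.0, 7.0) == N(arr, 0.0, 7.0)

# Markov process started at x0 (to be taken independent of the N's):
# X_t = x0 + N_{(0,t]}, and in particular X_0 = x0 since N_{(0,0]} = 0.
x0 = 5
assert x0 + N(arr, 0.0, 0.0) == x0

# Monte Carlo check of stationarity in mean: E[N_{(0,t]}] should be rate * t.
t, n_paths = 3.0, 20000
mean = sum(N(poisson_arrivals(rate, t, rng), 0.0, t)
           for _ in range(n_paths)) / n_paths
print(round(mean, 2))  # should be close to rate * t = 6.0
```

Note that nothing in the simulation ever evaluates the counting family at a single time point: only intervals $(a,b]$ are queried, which is exactly the answer's point about $N_0$ not existing.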