This is a classical question that has led to many a heated argument:
Should the symbol $\mathbb{N}$ stand for $0,1,2,3,\dots$ or $1,2,3,\dots$?
It is immediately obvious that the question is not quite well posed. This convention, like many others, is not carved in stone, and there is nothing to prevent mathematician $A$ from defining $\mathbb{N}$ to be the nonnegative integers (including $0$), and mathematician $B$ from defining $\mathbb{N}$ to be the positive integers (excluding $0$). It does not seem that either definition is accepted widely enough for it to be "the right definition", and even if this were the case, the fashion might change in the future.
I am, however, hoping that there might be a semi-mathematical reason to prefer one notion over the other. For example, I have spent much of my mathematical life believing that $0 \in \mathbb{N}$ because: 1) morally, $\mathbb{N}$ is the set of cardinalities of finite sets; 2) the empty set is a set with $0$ elements. However, recently I realised that this reasoning applies to $\omega$ rather than $\mathbb{N}$, and - much to my horror - I saw $\omega$ and $\mathbb{N}$ used side by side with the only distinction being that $0 \in \omega$ while $0 \not\in \mathbb{N}$. For another example, $\mathbb{N}$ seems to be a much nicer semigroup if $0 \not\in \mathbb{N}$ (and in any case, adjoining $0$ to a semigroup is a more natural operation than removing it), which would be an argument for taking $0 \not\in \mathbb{N}$.
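To illustrate why adjoining an identity is the more natural operation: it is a canonical construction that works for every semigroup, whereas removing an identity only makes sense in special cases. A minimal sketch in Lean 4, with illustrative names (`Semigroup'`, `adjoinOne`) that are not any library's actual API:

```lean
-- A bare semigroup: an associative binary operation.
structure Semigroup' (α : Type) where
  mul : α → α → α
  assoc : ∀ a b c, mul (mul a b) c = mul a (mul b c)

-- Adjoin a fresh identity element, here represented by `none`.
-- This works uniformly for ANY semigroup; the reverse operation
-- (deleting an identity) has no such canonical construction.
def adjoinOne {α : Type} (S : Semigroup' α) :
    Option α → Option α → Option α
  | none,   y      => y
  | x,      none   => x
  | some a, some b => some (S.mul a b)
```

In this picture, $(\mathbb{N} \setminus \{0\}, +)$ plays the role of the bare semigroup, and adjoining $0$ turns it into the monoid $(\mathbb{N}, +)$.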
The arguments mentioned above are, of course, rather weak, but perhaps just enough to tip the scale. In any case, this is the general type of argument I am looking for.
Question: Does there exist a convincing argument for deciding whether $0 \in \mathbb{N}$?
(I consider it quite possible that the answer is negative, because in some contexts one convention is preferable, and in other contexts the other. The problem could be dismissed by using $\mathbb{Z}_+$ (or even $\mathbb{Z}_{>0}$ and $\mathbb{Z}_{\geq 0}$) to avoid confusion, but note that in some contexts one definitely does not want to do this. I would be interested in an argument that is universal in the sense that it makes the overall mathematical landscape more elegant, and does not spoil any detail too much. I do not hope that the argument would be convincing to every mathematician, especially one working in a very specific and narrow area.)
Let me answer with a similar question: should "countable" include the finite sets?
The answer is simple: it depends on the context. Sometimes it's easier to have finite sets included in the definition of "countable", and sometimes it's easier to reserve "countable" for the infinite sets, and use "at most countable" for the term which includes finite sets as well.
I will give an argument why $\Bbb N$ should include $0$, though.
One can consider $0$ and $1$ as the basic atoms of the numbers we know. $\Bbb N$ is the set generated by $0,1$ using addition; then $\Bbb Z$ is generated by adjoining additive inverses, $\Bbb Q$ by adjoining multiplicative inverses, and from there one constructs $\Bbb R$ and $\Bbb C$.
Of course, if you take your atomic set of numbers to be $\Bbb C$ or something else, then this argument might as well be redundant, but it's still a reasonable one. With only some naive set theory, and axioms for addition and multiplication, we can create all the numbers we need! That's an incredible thing. And all just from the assumption that $0$ and $1$ exist.
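This "generated from atoms" picture also matches how the naturals are built in a proof assistant, where $0$ is forced to exist as the base case of the induction. A hedged sketch in Lean 4 syntax (using the hypothetical name `Nat'` to avoid clashing with the built-in `Nat`):

```lean
-- Peano-style naturals: `zero` is the base constructor, and every
-- other natural is obtained by repeated application of `succ`.
inductive Nat' where
  | zero : Nat'
  | succ : Nat' → Nat'

-- Addition by recursion on the second argument; the recursion
-- cannot even be stated without `zero` as its base case.
def add : Nat' → Nat' → Nat'
  | n, .zero   => n
  | n, .succ m => .succ (add n m)
```

Under this construction, excluding $0$ would mean discarding the very element that anchors both the definition and every inductive proof about it.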
On the other hand, in analysis it's often more convenient to have $0\notin\Bbb N$. For example, when we say that $x^n$ is well defined for every $x\in\Bbb R$ and $n\in\Bbb N$. Or, when we talk about the sequence $\frac1n$, it is easier to write "$\frac1n$ for $n\in\Bbb N$" than to add "...and $n>0$".