The definition of the Grassmann algebra over $\mathbb{R}$ with $L$ generators is (from Rogers' book on Supermanifolds)
My problem with this: we never say what these generators are. Do they even exist? When we define the operations on the elements of $\mathbb{N}$, for example, we are on solid ground, since the natural numbers can be constructed from sets in such a way that their existence is assured (if one accepts the axioms of set theory, of course). Here, by contrast, we describe how the multiplication of the generators works, but we have not defined the generators at all.
Maybe my question is somehow a non-question, but it keeps bugging me.


A similar example to the Grassmann algebra is the polynomial ring. In abstract algebra, polynomials in a symbol $x$ (a formal, or "dummy", variable) over a field (e.g. $\mathbb{R}$ or $\mathbb{C}$) form a ring. See, for example,
https://en.wikipedia.org/wiki/Polynomial_ring
In the polynomial ring, polynomials are not understood as functions of $x$ but as expressions. The variable $x$ in a polynomial such as $1-2x+x^3$ is not understood as a number, only as a symbol. Insisting on this point may not seem so necessary for polynomials, since you do get a number when you plug a number in for $x$. But this "symbolic" understanding of polynomials helps generalize the concept to formal power series. See, for example,
https://en.wikipedia.org/wiki/Formal_power_series
An infinite series in the symbol $x$ is only an expression. This removes any worry about convergence. One can write down, e.g., $1+x+2!\,x^2+\cdots+n!\,x^n+\cdots$ as a formal power series with no problem, even though the radius of convergence of this series is $0$, meaning the series is not well defined for any $x\neq 0$ when understood as a function. Of course, there have to be stricter rules about how to manipulate these formal expressions so that the result is always well defined.
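To make the "coefficients only" viewpoint concrete, here is a minimal sketch (the helper name `cauchy_product` is my own, not from any library): a formal power series is stored purely as its list of coefficients, and multiplication is the termwise Cauchy product, so convergence never enters.

```python
# A formal power series is represented purely by its coefficient sequence;
# arithmetic is done coefficient by coefficient, so convergence is irrelevant.
from math import factorial

def cauchy_product(a, b):
    """Multiply two formal series given as coefficient lists, truncated."""
    n = min(len(a), len(b))
    return [sum(a[k] * b[i - k] for k in range(i + 1)) for i in range(n)]

# Coefficients of 1 + x + 2! x^2 + ... + n! x^n (radius of convergence 0).
f = [factorial(n) for n in range(6)]
# Coefficients of the geometric series 1/(1-x) = 1 + x + x^2 + ...
g = [1] * 6

print(cauchy_product(f, g))  # [1, 2, 4, 10, 34, 154]: partial sums of n!
```

Even though $\sum n!\,x^n$ diverges for every $x\neq 0$, its formal product with another series is perfectly well defined, coefficient by coefficient.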
It's a bit of a digression to talk about formal power series, but the symbolic view of the variable $x$ in polynomials helps us understand the Grassmann algebra. Before that, consider a polynomial ring in two symbols $x$ and $y$. Even though we don't know what $x$ and $y$ represent, we do assume $xy=yx$ in order to handle products such as $(x+y)(x-y)=x^2-y^2$; otherwise the $xy$ and $yx$ terms would not cancel. Even for one symbol $x$, we need to assume $xx^2=x^2x=x^3$: if the multiplication of the symbol were not associative, a polynomial in $x$ would not be well defined either.
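The commuting-symbols computation above can be checked with a computer algebra system; SymPy, for instance, treats symbols as formal commuting generators by default, so the cross terms cancel exactly as described:

```python
# SymPy symbols commute by default, so xy and yx cancel in (x+y)(x-y).
import sympy

x, y = sympy.symbols('x y')
product = sympy.expand((x + y) * (x - y))
print(product)  # x**2 - y**2
```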
As opposed to a multivariate polynomial ring, the Grassmann algebra makes a different assumption about how the multiplication between the symbols $x$ and $y$ works. Instead of assuming $xy=yx$, the symbols (or generators) of the Grassmann algebra anticommute, i.e., $xy=-yx$. Not only that, a symbol also anticommutes with itself, so $x^2=-x^2$ and hence $x^2=0$. So the most general element of a Grassmann algebra with $2$ symbols $x$ and $y$ is given by the expression
$$a+bx+cy+dxy,$$
where $a,b,c,d$ are numbers in the field over which the Grassmann algebra is defined. They can be real numbers, complex numbers, etc. The symbols $x$ and $y$ do not belong to these fields; they are just symbols. So multiplications between the coefficients $a,b,c,d$ commute as normal real or complex numbers do, while multiplications between the symbols anticommute, $xy=-yx$. The product of two elements of the Grassmann algebra is given by
\begin{align}
&(a_1+b_1x+c_1y+d_1xy)(a_2+b_2x+c_2y+d_2xy)\\
&\quad=a_1a_2+(a_1b_2+a_2b_1)x+(a_1c_2+a_2c_1)y+(b_1c_2-b_2c_1+a_1d_2+a_2d_1)xy.
\end{align}
We can also have a Grassmann algebra over the quaternions (a division ring rather than a field). Then the coefficients $\,a,b,c,d\,$ in general do not commute. One step at a time.
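Since an element here is determined entirely by its four coefficients, the product formula can be transcribed directly into code. A minimal sketch (the names `gr_mul`, `X`, `Y` are my own, and real coefficients are assumed so that they commute with the generators):

```python
# A two-generator Grassmann element a + b*x + c*y + d*xy is stored as the
# coefficient tuple (a, b, c, d); gr_mul is a direct transcription of the
# product formula in the text.
def gr_mul(u, v):
    a1, b1, c1, d1 = u
    a2, b2, c2, d2 = v
    return (a1 * a2,                                 # scalar part
            a1 * b2 + a2 * b1,                       # x component
            a1 * c2 + a2 * c1,                       # y component
            b1 * c2 - b2 * c1 + a1 * d2 + a2 * d1)   # xy component

X = (0, 1, 0, 0)  # the generator x
Y = (0, 0, 1, 0)  # the generator y

print(gr_mul(X, X))  # (0, 0, 0, 0):  x^2 = 0
print(gr_mul(X, Y))  # (0, 0, 0, 1):  xy
print(gr_mul(Y, X))  # (0, 0, 0, -1): yx = -xy
```

Note that the generators never have to "be" anything: the whole algebra lives in the coefficient tuples and the multiplication rule, which is precisely the point of the symbolic view.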