I'm reading through Gelfand and Fomin's 'Calculus of Variations', and they've just derived the Hamilton-Jacobi Equation:
$$\frac{\partial S}{\partial x} + H \left(x, y_1, \ldots, y_n , \frac{\partial S}{\partial y_1},\ldots,\frac{\partial S}{\partial y_n}\right)=0$$
where $S=S(x, y_1, \ldots, y_n)$. They then go on to say that the complete integral of this equation depends on $n$ parameters:
$$S = S(x,y_1, \ldots, y_n, \alpha_1,\ldots,\alpha_n )$$
However, $S$ depends on $n+1$ independent variables $(x, y_1, \ldots, y_n)$, and the Hamilton-Jacobi equation includes partial derivatives with respect to all of them, so shouldn't the complete integral have $n+1$ parameters?
Notice that the Hamilton–Jacobi (HJ) equation does not depend on the undifferentiated function $S$ itself, only on its partial derivatives. Hence, if $S$ is a solution to the HJ equation, then adding any constant $\alpha_{n+1}$ trivially yields a new solution $S+\alpha_{n+1}$. So the complete integral does formally carry $n+1$ integration constants, but one of them is purely additive and carries no information; traditionally, one does not bother to include this trivial integration constant $\alpha_{n+1}$ in the tally of integration constants $\alpha_{1}, \ldots, \alpha_{n}$.
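To see this concretely, here is a minimal worked example (not from Gelfand and Fomin; I am assuming the simplest case $n=1$ with the free-particle-type Hamiltonian $H(x,y,p)=\tfrac{1}{2}p^2$). The HJ equation becomes
$$\frac{\partial S}{\partial x} + \frac{1}{2}\left(\frac{\partial S}{\partial y}\right)^2 = 0,$$
and the separation ansatz $S = \alpha_1 y + f(x)$ forces $f'(x) = -\tfrac{1}{2}\alpha_1^2$, giving the complete integral
$$S(x, y, \alpha_1) = \alpha_1 y - \frac{\alpha_1^2}{2}\,x$$
with exactly $n = 1$ essential parameter. Adding a constant, $S + \alpha_2$, solves the equation just as well, since only $\partial S/\partial x$ and $\partial S/\partial y$ enter, but this second constant is the trivial one that goes uncounted.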