Note
I indicate this explicitly at some points in the question, but unless stated otherwise, $i,j,k \in \{1, \ldots, n\}$, $a,b,c \in \{1, \ldots, R_W\}$, and $\alpha, \beta, \gamma \in \{R_W + 1, \ldots, n \}$, where $n$ and $R_W$ are defined below. If, for example, I write $q_i$ in the argument of a function, I mean that the function can depend on all of the coordinates $q_1, \ldots, q_n$.
Setup
This question was motivated by a physics problem, but the part I'm stuck on is essentially pure math, so this seems like the right place to ask it. Suppose I have the Lagrangian $L(q_i , \dot{q}_i )$, where $i \in \{1, \ldots, n\}$. I define the momenta $$p_i \equiv \frac{\partial L}{\partial \dot{q}_i}$$ and the matrix $$W_{ij} = \frac{\partial p_j}{\partial \dot{q}_i} = \frac{\partial^2 L}{\partial \dot{q}_i \partial \dot{q}_j}$$

Without loss of generality, I assume the submatrix $W_{ab}$ with $a,b \in \{1, \ldots, R_W\}$ is the largest invertible submatrix of $W_{ij}$ with $i,j \in \{1, \ldots, n\}$ (i.e. $W$ has rank $R_W$, and the $q_i$ and $\dot{q}_i$ have been reordered so that this invertible submatrix sits in the upper left corner). I then define the function $$h: \mathbb{R}^{R_W} \times \mathbb{R}^{n - R_W} \times \mathbb{R}^{n - R_W} \times \mathbb{R}^{R_W} \times \mathbb{R}^n \to \mathbb{R}^n$$ by $$h_i(\dot{q}_a , \tilde{p}_\alpha , \dot{q}_\beta , \tilde{p}_b , q_j ) = \tilde{p}_i - \frac{\partial L}{\partial \dot{q}_i} $$ ($a,b \in \{1, \ldots, R_W\}$, $\alpha , \beta \in \{R_W + 1 , \ldots, n\}$, and $i,j \in \{1, \ldots, n\}$). Note that in this definition the $\tilde{p}_i$ are independent coordinates, not necessarily equal to the $p_i$ defined above.

This function has Jacobian matrix $$ J = \begin{pmatrix} W_{ab} & 0 & A \\ B & I_{n - R_W} & C \end{pmatrix} $$ where $I_{n - R_W}$ is the $(n - R_W) \times (n - R_W)$ identity matrix. A straightforward computation shows that the matrix $K$ defined by $$ K = \begin{pmatrix} W_{ab} & 0 \\ B & I_{n - R_W} \end{pmatrix} $$ is invertible. (I proved this by showing that the row vectors of the matrix are linearly independent; it also follows from the block-triangular structure, since $\det K = \det(W_{ab}) \det(I_{n - R_W}) \neq 0$.)
Since $\frac{\partial L}{\partial \dot{q}_i}$ is a function only of the $q_i$ and $\dot{q}_i$, it is clear that $h_i(\dot{q}_a , \tilde{p}_\alpha , \dot{q}_\beta , \tilde{p}_b , q_j ) = 0$ if we set $\tilde{p}_i = p_i (q_i, \dot{q}_i) = \frac{\partial L}{\partial \dot{q}_i} (q_i, \dot{q}_i )$. Since the square block $K$ of the Jacobian is invertible, we can apply the implicit function theorem to conclude that there (locally) exist functions $f_a$ and $g_\alpha$ such that $$ h_i(f_a(\dot{q}_\beta , \tilde{p}_b , q_j) , g_\alpha (\dot{q}_\beta , \tilde{p}_b , q_j) , \dot{q}_\beta , \tilde{p}_b , q_j ) = 0 $$ More precisely, $\tilde{p}_i = \frac{\partial L}{\partial \dot{q}_i}$ if and only if $\dot{q}_a = f_a (\dot{q}_\beta, \tilde{p}_b , q_j)$ and $\tilde{p}_\alpha = g_\alpha (\dot{q}_\beta, \tilde{p}_b, q_j)$; that is, requiring $\tilde{p}_i = p_i = \frac{\partial L}{\partial \dot{q}_i}$ imposes exactly these constraints.
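As a concrete sanity check (a toy example of my own, not taken from the reference), consider $n = 2$ with the degenerate Lagrangian $L = \frac{1}{2}(\dot{q}_1 + \dot{q}_2)^2$, which has $R_W = 1$. Solving $h_i = 0$ for $\dot{q}_1$ and $\tilde{p}_2$ gives $f_1 = \tilde{p}_1 - \dot{q}_2$ and $g_2 = \tilde{p}_1$, and in this example $g_2$ is indeed independent of the remaining velocity $\dot{q}_2$:

```python
import sympy as sp

# Velocities and independent momentum variables ptilde_i.
# (This toy L has no q-dependence, so no q symbols are needed.)
qd1, qd2 = sp.symbols('qdot1 qdot2')
pt1, pt2 = sp.symbols('ptilde1 ptilde2')

# Toy degenerate Lagrangian with n = 2: L = (1/2)(qdot1 + qdot2)^2.
L = sp.Rational(1, 2) * (qd1 + qd2)**2

# Momenta p_i = dL/dqdot_i and Hessian W_ij = d^2 L / dqdot_i dqdot_j.
p1, p2 = sp.diff(L, qd1), sp.diff(L, qd2)
W = sp.Matrix([[sp.diff(pj, qdi) for qdi in (qd1, qd2)] for pj in (p1, p2)])
assert W.rank() == 1  # R_W = 1; the upper-left 1x1 block W_11 = 1 is invertible

# Solve h_i = ptilde_i - dL/dqdot_i = 0 for (qdot_1, ptilde_2), i.e. for the
# functions f_1 and g_2 produced by the implicit function theorem.
sol = sp.solve([pt1 - p1, pt2 - p2], [qd1, pt2], dict=True)[0]

print(sol[qd1])  # f_1 = ptilde1 - qdot2: depends on the velocity qdot2
print(sol[pt2])  # g_2 = ptilde1: independent of qdot2
```

Of course, one example proves nothing in general; it just exhibits the behavior the claim below asserts for every Lagrangian.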
The Question
The reference I'm following now claims something that must be true (otherwise there would be a serious problem with the theory of constrained Hamiltonian systems), but that I can't figure out how to prove: it claims that the $g_\alpha$ do not in fact depend on the $\dot{q}_\beta$. The reference gives the following argument:
> But the rhs [of the equation $\tilde{p}_\alpha = g_\alpha (\dot{q}_\beta, \tilde{p}_b, q_j)$] cannot depend on the velocities $\dot{q}_\beta$, since otherwise we could express still more velocities [(the $\dot{q}_i$)] from the set $\{\dot{q}_\beta\}$ in terms of the coordinates [(the $q_i$)], the momenta [(the $p_i$)] and the remaining velocities [(the $\dot{q}_a$)], which is not possible.
I don't find this argument particularly compelling. My question is: can someone provide a proof of the claim that $\tilde{p}_\alpha = g_\alpha (\tilde{p}_b , q_j)$ when $\tilde{p}_i = \frac{\partial L}{\partial \dot{q}_i}$?