I am trying to understand isoperimetric graph partitioning.
Specifically, we have a graph defined by a Laplacian matrix $L=D-W$, where $W_{ij}$ is the weight of the edge from $i$ to $j$ and $D$ is the diagonal matrix whose entry $D_{ii}$ is the degree (sum of incident edge weights) of vertex $i$.
We want to minimize the quantity $H = x^TLx$ subject to the constraint $x^Td = k$, where $d$ is the vector containing the degree of each vertex.
Since we have a constraint, we introduce a Lagrange multiplier $\lambda$, take the gradient with respect to $x$, and get $2Lx-\lambda d=0$.
At this point, the paper I am reading says that $\lambda$ is a constant. I don't see how this is true. By my understanding of Lagrange multipliers, $\lambda = \frac{\partial (x^TLx) /\partial x_i}{\partial (x^Td) /\partial x_i}$ for each $x_i$. I see that the denominator, $d_i$, is a constant, but the numerator, $2(Lx)_i$, depends on all the $x_j$'s.
Here is the paper: http://cns.bu.edu/~lgrady/grady2006isoperimetric_full.pdf
The relevant section is on page 7, right under equation 15.
The theory of Lagrange multipliers says: under the given circumstances there is a single value $\lambda\in{\mathbb R}$ such that at the conditionally stationary point $x$ the equations $$2Lx-\lambda d=0,\qquad x^Td=k\tag{1}$$ hold simultaneously. Now $(1)$ is a system of $N+1$ equations in $N+1$ unknowns, one of them $\lambda$. If the situation is not degenerate you should expect finitely many solutions of the form $(x_1,\ldots, x_N,\lambda)$; you are usually not interested in the value of $\lambda$. In particular, $\lambda$ is not computed coordinate by coordinate: the $N$ equations $2(Lx)_i = \lambda d_i$ all share the same scalar $\lambda$, evaluated at the unknown stationary point $x$.
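To make this concrete, $(1)$ can be assembled and solved as one linear system in the $N+1$ unknowns $(x,\lambda)$. Below is a minimal numpy sketch on a hypothetical 4-vertex graph (the weights are made up for illustration); for a connected graph the augmented matrix is nonsingular, so there is exactly one solution, with $\lambda$ appearing as a single scalar unknown.

```python
import numpy as np

# Hypothetical symmetric weight matrix of a small connected graph.
W = np.array([[0, 1, 2, 0],
              [1, 0, 1, 3],
              [2, 1, 0, 1],
              [0, 3, 1, 0]], dtype=float)
d = W.sum(axis=1)          # degree vector
L = np.diag(d) - W         # graph Laplacian L = D - W
k = 1.0                    # constraint value in x^T d = k
N = len(d)

# Stationarity and constraint as one linear system in (x, lambda):
#   [ 2L   -d ] [x]        [0]
#   [ d^T   0 ] [lambda] = [k]
M = np.block([[2 * L, -d[:, None]],
              [d[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(N), [k]])
sol = np.linalg.solve(M, rhs)
x, lam = sol[:N], sol[N]

print(x)    # the stationary point
print(lam)  # the single scalar multiplier
```

For this particular (ungrounded) problem the unique solution is the trivial one, $x = \frac{k}{\operatorname{vol}(G)}\mathbf{1}$ with $\lambda = 0$, since the constant vector is in the null space of $L$; the paper rules this out by grounding a vertex. But the mechanics are the same either way: $\lambda$ enters the system as one extra scalar unknown, not one value per coordinate.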