I'm studying statistical physics these days and have recently learnt the concept of the discrete Laplacian. For a finite graph $G$ with vertices $\Lambda\subset\mathbb{Z}^d,$ consider the discrete Laplacian defined as: $$(\Delta f)_x=\sum_{y\sim x}f_y-f_x.$$
It seems that there is a well-known result which states that:
If $\displaystyle\sum_{x\in\Lambda}f_x=0$, then the Laplace operator $\Delta$ is invertible.
However, I'm having some difficulty understanding why this result is true. Could anybody give me a proof of this result? Many thanks in advance for your help!
Saying whether $\Delta$ is invertible or not based on a condition on $f$ is a sort of type error, but what is true is the following: if $G$ is connected, then $\Delta$ is invertible as a linear map on the subspace $V_0 = \{f : \sum_{x\in\Lambda} f_x = 0\}$ of functions summing to zero.
The logic in the proof is that you want to:

1. Show that $\sum_{x\in\Lambda} (\Delta g)_x = 0$ for every $g$, so that $\Delta$ maps the subspace of functions summing to zero into itself.
2. Show that the kernel of $\Delta$ consists only of constant functions, so that the only function in the kernel summing to zero is $0$; then the restricted map is injective, and an injective linear map of a finite-dimensional space to itself is invertible.
Step 1 follows directly from the definition of the Laplacian: in the sum $$ \sum_{x \in \Lambda} (\Delta g)_x = \sum_{x \in \Lambda} \sum_{y \sim x} (g_y - g_x) $$ every adjacent pair of vertices $\{x,y\}$ contributes $(g_y - g_x)$ to the sum when we consider $x$ first, and $(g_x - g_y)$ to the sum when we consider $y$ first. As a result, everything must cancel to $0$.
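This cancellation is easy to check numerically. Here is a minimal sketch, using a path graph on $\{0,\dots,n-1\}$ as a hypothetical concrete choice of $\Lambda$ (not something specified in the question):

```python
import numpy as np

# Matrix of the discrete Laplacian (Δf)_x = Σ_{y~x} (f_y - f_x) on a path
# graph with vertices {0, ..., n-1} (a hypothetical concrete choice of Λ).
def laplacian_path(n):
    L = np.zeros((n, n))
    for x in range(n):
        for y in (x - 1, x + 1):      # neighbours of x in the path
            if 0 <= y < n:
                L[x, y] += 1          # the +f_y term
                L[x, x] -= 1          # the -f_x term
    return L

L = laplacian_path(5)
g = np.random.default_rng(0).normal(size=5)
# Each edge {x,y} contributes (g_y - g_x) + (g_x - g_y) = 0, so the total
# sum is zero up to floating-point error.
print(np.sum(L @ g))
```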
Step 2 is the hard part. It is where we use the necessary assumption that $G$ is connected; in general, the dimension of the kernel of $\Delta$ (more commonly phrased as the multiplicity of $0$ as an eigenvalue) is equal to the number of connected components.
One way to proceed is to consider the quadratic form $$g \cdot \Delta g = \sum_{x \in \Lambda} \sum_{y\sim x} g_x(g_y - g_x).$$ Every adjacent pair of vertices $\{x,y\}$ contributes $g_x(g_y - g_x) + g_y(g_x - g_y) = -(g_x - g_y)^2$ to the sum; therefore $g \cdot \Delta g = 0$ if and only if $g_x - g_y = 0$ for every adjacent pair $\{x,y\}$.
In a connected graph, this implies that $g$ is constant. Whenever $\Delta g =0$, we certainly also have $g \cdot \Delta g = 0$, so in this case $g$ must be constant as well.
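Both facts in Step 2, the quadratic-form identity and the one-dimensional kernel, can be verified numerically on a small connected graph. A sketch, again using a hypothetical 5-vertex path graph:

```python
import numpy as np

# Matrix of (Δf)_x = Σ_{y~x} (f_y - f_x) on the path graph {0, ..., 4}
# (a hypothetical example, not taken from the original post).
n = 5
L = np.zeros((n, n))
for x in range(n):
    for y in (x - 1, x + 1):          # neighbours of x in the path
        if 0 <= y < n:
            L[x, y] += 1              # the +f_y term
            L[x, x] -= 1              # the -f_x term

# Quadratic form: g·Δg equals minus the sum of (g_x - g_y)² over edges.
g = np.arange(n, dtype=float)
edges = [(x, x + 1) for x in range(n - 1)]
print(np.isclose(g @ L @ g, -sum((g[a] - g[b]) ** 2 for a, b in edges)))

# Kernel: the path is connected, so 0 should be a simple eigenvalue whose
# eigenvector is the constant function.
w, V = np.linalg.eigh(L)              # L is symmetric, so eigh applies
print(np.sum(np.isclose(w, 0)))       # multiplicity of the eigenvalue 0

# Consequently Δ is invertible on the mean-zero subspace: any f summing to
# zero lies in the image, and the pseudoinverse recovers a solution.
f = np.array([1.0, -2.0, 0.0, 3.0, -2.0])   # sums to 0
sol = np.linalg.pinv(L) @ f
print(np.allclose(L @ sol, f))        # f = Δ(sol)
```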
As a side note: we saw above that $g \cdot \Delta g$ ends up a negative sum of squares here. For this reason, it is more common to see the Laplacian defined as $$(\Delta f)_x = \sum_{y \sim x} f_x - f_y$$ which produces a positive sum of squares in the same step.
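With that flipped convention the matrix is the negative of the one defined at the top, and the quadratic form becomes a positive sum of squares, i.e. the matrix is positive semidefinite. A quick check on the same hypothetical 5-vertex path graph:

```python
import numpy as np

# Matrix of the opposite sign convention (Δf)_x = Σ_{y~x} (f_x - f_y) on
# the path graph {0, ..., 4} (a hypothetical example).
n = 5
Lpos = np.zeros((n, n))
for x in range(n):
    for y in (x - 1, x + 1):          # neighbours of x in the path
        if 0 <= y < n:
            Lpos[x, y] -= 1           # the -f_y term
            Lpos[x, x] += 1           # the +f_x term

# All eigenvalues are >= 0 (up to floating-point error), matching the
# positive sum of squares in the quadratic form.
print(np.all(np.linalg.eigvalsh(Lpos) >= -1e-12))
```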