Given $d$-regular graph $G=(V, E)$, the adjacency matrix $A$, and the normalized Laplacian $L=I-\dfrac{1}{d}A$, prove that for any vector $\vec{x}\in \mathbf{R^{|V|}}$: $$ \vec{x}L\vec{x}^t=\dfrac{1}{d}\sum_{(u,v)\in E}(x_u-x_v)^2 $$
$u$ and $v$ are vertices, and $x_u$ and $x_v$ are values of the vector $\vec{x}$ corresponding to the vertices.
I don't know where to start.
Say we have a graph with $N$ vertices whose connections are defined by the adjacency matrix $A$. Namely, vertex $i$ is connected to vertex $j$ if $A_{ij} = 1$, and $A_{ij} = 0$ otherwise. Let $D$ be the diagonal matrix defined as $D = \operatorname{diag}(\operatorname{deg}(A)_1, \operatorname{deg}(A)_2, \ldots,\operatorname{deg}(A)_N )$, where $\operatorname{deg}(A)_k$ is the degree of vertex $k$. By definition, this degree is $\operatorname{deg}(A)_k \equiv \sum_{\ell=1}^{N} A_{k\ell}$. For a $d$-regular graph, we have $\deg(A)_k = d$ for all $k$, but we will keep the degrees general here.
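As a quick numerical illustration of these definitions (the 4-cycle $C_4$ here is a hypothetical example, chosen only because it is a small $2$-regular graph):

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4, a 2-regular graph:
# edges 0-1, 1-2, 2-3, 3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

# deg(A)_k = sum_l A_{kl}: the row sums of A give the vertex degrees.
degrees = A.sum(axis=1)

# D = diag(deg(A)_1, ..., deg(A)_N)
D = np.diag(degrees)

print(degrees)  # every vertex of C4 has degree 2
```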
We define the Laplacian as $L \equiv D - A$. Defining the row vector $\textbf{x}^{T} = (x_1, x_2, \ldots, x_N)$, we then have \begin{align} \textbf{x} L \textbf{x}^{T} & = \sum_{i, j=1}^{N} x_{i} L_{ij} x_{j} = \sum_{i, j=1}^{N} x_{i} (\delta_{ij} \operatorname{deg}(A)_{j} - A_{ij}) x_{j}\\ & = \sum_{i, j=1}^{N} x_{i} \delta_{ij} \sum_{\ell=1}^NA_{j\ell}x_{j} - \sum_{i, j=1}^N x_{i}A_{ij} x_{j}\\ & = \sum_{i, \ell=1}^{N} x_{i}A_{i\ell}x_{i} - \sum_{i, \ell=1}^N x_{i}A_{i\ell} x_{\ell}\\ & = \sum_{i, \ell=1}^{N} A_{i\ell}(x_i^2 - x_i x_{\ell})\\ \end{align} In the second equality, we used the Kronecker delta $\delta_{ij}$. In the first term of the fourth equality, we summed over the $j$ index and in the second term of the fourth equality, we switched the dummy summation variable $j$ with $\ell$.
Now, because $A$ is the adjacency matrix for an unweighted graph, we have $A = A^T$. And thus $\sum_{i, \ell =1}^N A_{i\ell} x_i^2 = \sum_{i, \ell = 1}^N A_{i \ell} x_{\ell}^2$. Therefore, we have \begin{align} \textbf{x} L \textbf{x}^{T} & = \frac{1}{2} \sum_{i, \ell = 1}^NA_{i \ell}(x_i^2 + x_{\ell}^2 - 2 x_{i} x_{\ell})\\ & = \frac{1}{2} \sum_{i, \ell = 1}^NA_{i \ell}(x_i- x_{\ell})^2. \end{align} With $A$ the adjacency matrix, $A_{i\ell}$ is only non-zero when the connection $(i, \ell)$ is in the set of edges $E$. So we obtain
\begin{align} \textbf{x} L \textbf{x}^{T} = \sum_{ (i, \ell) \in E}(x_i- x_{\ell})^2, \end{align} where the factor of $\frac{1}{2}$ cancels because the double sum over $i$ and $\ell$ counts each edge twice: once as $(i, \ell)$ and once as $(\ell, i)$. Finally, for a $d$-regular graph we have $D = dI$, so the normalized Laplacian in your question is $I - \frac{1}{d}A = \frac{1}{d}(D - A) = \frac{1}{d}L$, and therefore \begin{align} \textbf{x} \left( I - \frac{1}{d}A \right) \textbf{x}^{T} = \frac{1}{d}\sum_{ (i, \ell) \in E}(x_i- x_{\ell})^2, \end{align} which is the claimed identity.
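As a numerical sanity check of both the $D - A$ identity and its normalized version (the 4-cycle and the random vector are hypothetical choices, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# 4-cycle C4: 2-regular, with edges {0-1, 1-2, 2-3, 3-0}.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
d = 2
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # each edge listed once

L = np.diag(A.sum(axis=1)) - A   # combinatorial Laplacian D - A
L_norm = np.eye(4) - A / d       # normalized Laplacian I - A/d

x = rng.standard_normal(4)
edge_sum = sum((x[u] - x[v]) ** 2 for (u, v) in edges)

# x L x^T equals the sum over edges; dividing by d gives the normalized form.
print(np.isclose(x @ L @ x, edge_sum))           # True
print(np.isclose(x @ L_norm @ x, edge_sum / d))  # True
```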