Given an undirected graph $\mathcal{G} = (V, E)$, if we consider a signal $\bf{x} \in \mathbb{R}^n$ (where $x_i$ is the value at the $i$-th node) and a filter $\bf{g} \in \mathbb{R}^n$, then we can define a notion of graph convolution by
\begin{equation} \bf{x} *_{\mathcal{G}} \bf{g} := \bf{U(U^Tx \, \, \odot U^Tg)} \end{equation}
where $\odot$ denotes the element-wise product. What I don't understand is why, if we take a filter $\bf{g_\theta} = \operatorname{diag}(\bf{U^Tg})$, the above equation becomes
\begin{equation} \bf{x} *_{\mathcal{G}} \bf{g_{\theta} = Ug_{\theta}U^Tx} \end{equation}
I guess this follows from some property of the Hadamard product that I'm missing at the moment.
EDIT: adding more information.
$U$ is the matrix of eigenvectors obtained by eigendecomposing the (possibly normalized) Laplacian:
\begin{equation} L = U \Lambda U^T \end{equation}
where $L$ is defined as $L := I_n - D^{-1}A$, with $A$ and $D$ the adjacency and degree matrices, respectively.
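For concreteness, here is a small NumPy check that the two forms agree numerically (the 4-node path graph and the random signal/filter are just an illustration; I use the symmetric normalization $L = I_n - D^{-1/2}AD^{-1/2}$ so that the eigenvector matrix $U$ is orthogonal):

```python
import numpy as np

# Illustrative graph (assumed for this example): a 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)

# Symmetric normalized Laplacian, so eigh gives an orthonormal U
L = np.eye(4) - np.diag(d ** -0.5) @ A @ np.diag(d ** -0.5)
lam, U = np.linalg.eigh(L)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)   # graph signal
g = rng.standard_normal(4)   # spatial filter

conv_hadamard = U @ ((U.T @ x) * (U.T @ g))   # U(U^T x ⊙ U^T g)
g_theta = np.diag(U.T @ g)                    # g_θ = diag(U^T g)
conv_diag = U @ g_theta @ U.T @ x             # U g_θ U^T x

print(np.allclose(conv_hadamard, conv_diag))  # True
```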
Notice that your graph convolution is given by:
$$ U g(\Lambda) U^\top x = U \operatorname{diag}(g(\lambda))U^\top x$$
where $g(\Lambda)$ is a diagonal matrix containing the vector $g(\lambda)$, sometimes called the graph frequency response. It turns out that for polynomial graph filters $g(\lambda)= \Psi g$, where $\Psi$ is a Vandermonde matrix whose $k$-th column contains the powers $\lambda^k$ of the eigenvalue vector, and $g$ is the vector of $K$ graph filter coefficients. The element-wise multiplication you describe is:
$$U( g(\lambda) \odot U^\top x)$$
which is the same expression, since $\operatorname{diag}(a)\,b = a \odot b$ for any vectors $a, b$.
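To make the polynomial-filter claim concrete, here is a small NumPy sketch (the graph, signal, and the $K=3$ coefficients are arbitrary illustrations) checking that the spectral form $U \operatorname{diag}(\Psi g) U^\top x$ equals applying the polynomial $\sum_k g_k L^k$ directly to $x$:

```python
import numpy as np

# Illustrative graph (assumed for this example)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - np.diag(d ** -0.5) @ A @ np.diag(d ** -0.5)  # symmetric normalized Laplacian
lam, U = np.linalg.eigh(L)

K = 3
g = np.array([0.5, -0.2, 0.1])             # filter coefficients g_0 .. g_{K-1}
Psi = np.vander(lam, K, increasing=True)   # Vandermonde: Psi[:, k] = lam**k
g_lam = Psi @ g                            # graph frequency response g(λ) = Ψ g

x = np.array([1.0, 0.0, -1.0, 2.0])        # graph signal
spectral = U @ (g_lam * (U.T @ x))         # U diag(g(λ)) U^T x
poly = sum(gk * np.linalg.matrix_power(L, k) @ x for k, gk in enumerate(g))

print(np.allclose(spectral, poly))  # True
```

The agreement follows from $L^k = U\Lambda^k U^\top$: a polynomial in $L$ can be applied without ever computing the eigendecomposition, which is why polynomial parametrizations are popular in practice.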