We have that: $$ \mu = E_{(\mathbf{x},y)\sim f(x,y)} \left[(Y - g(\mathbf{X}))^2 \right] $$
where the function $g(\cdot)$ estimates $Y$ from $\mathbf{X}$. There is a true function that maps $\mathbf{X}$ to $Y$. For example, take $Y = \mu$ and $g_{\text{true}}(\mathbf{X}) = \frac{1}{n}\sum_{k=1}^n X_k$, where $\mu$ is the mean of a Gaussian distribution with $\sigma = 1$, each $X_k$ is a realization of that distribution, and $f(y) = \mathcal{U}(0,2)$, i.e. $f(x,y) = f(x\mid y)\,f(y) = \mathcal{N}(\mu,1)\cdot\frac{1}{2}$. To benchmark this estimator we use $N$ realizations of $Y$ and $\mathbf{X}$:
$$ \frac{1}{N}\sum_{i=1}^N \left(Y_{i} -g(\mathbf{X}_i) \right)^2 \to E[(Y_i-g(\mathbf{X}_i))^2] $$
as $N \to \infty$, in probability, by the law of large numbers. Is this true? If not, how can I obtain an estimate of $E[(Y-g(\mathbf{X}))^2]$?
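For concreteness, the benchmarking procedure above can be simulated directly (a sketch in NumPy; the inner sample size $n$ used by $g$ and the number $N$ of Monte Carlo draws are illustrative choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 25        # samples of X per realization (used inside g)
N = 200_000   # Monte Carlo realizations used to approximate the expectation

# Draw Y = mu ~ Uniform(0, 2), then X | Y = y ~ Normal(y, 1), n values per draw
y = rng.uniform(0.0, 2.0, size=N)
x = rng.normal(loc=y[:, None], scale=1.0, size=(N, n))

# g(X) = sample mean of the n observations
g = x.mean(axis=1)

# Empirical average of squared errors; by the LLN this approaches
# E[(Y - g(X))^2] as N grows
risk_hat = np.mean((y - g) ** 2)
print(risk_hat)
```

Running this for increasing $N$ shows the empirical average settling down to a fixed value, which is the convergence the question asks about.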
In general, whenever you can guarantee a priori that the expectation $\mu$ is finite, the Law of Large Numbers (LLN) gives exactly the convergence you describe.
Here the $X_i$'s are iid Normal$(\mu, 1)$, where $\mu$ itself follows Uniform$(0, 2)$. Set $Y = \mu$, so that given $Y = y$ the $X_i$'s are iid $N(y, 1)$, and estimate $y$ by the sample mean, $$g(\mathbf{X}) = \overline{X} = \frac{1}{n}\sum_{i=1}^n X_i.$$ Since $\overline{X}$ is unbiased for $y$, $$\mathbb{E}\left[(Y-g(\mathbf{X}))^2\,\Big|\, Y = y\right] = \mathbb{E}\left[(\overline{X} - y)^2\,\Big|\, Y = y\right] = \text{Var}(\overline{X}\mid Y = y) = \frac{1}{n}.$$ Now the law of total expectation directly gives $$\mathbb{E}\left[(Y - g(\mathbf{X}))^2\right] = \mathbb{E}\left[\mathbb{E}\left[(Y - g(\mathbf{X}))^2\,\Big|\, Y\right]\right] = \mathbb{E}\left[\frac{1}{n}\right] = \frac{1}{n}.$$
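This calculation is easy to check by simulation (a sketch; $n$ and the number of replications are arbitrary choices). Since $\overline{X}\mid Y=y \sim N(y, 1/n)$, we can sample $\overline{X}$ directly instead of averaging $n$ individual draws:

```python
import numpy as np

rng = np.random.default_rng(42)

n = 10          # sample size behind each X-bar
reps = 500_000  # Monte Carlo replications

y = rng.uniform(0.0, 2.0, size=reps)        # Y = mu ~ Uniform(0, 2)
xbar = rng.normal(y, 1.0 / np.sqrt(n))      # X-bar | Y = y ~ Normal(y, 1/n)

# Empirical E[(Y - g(X))^2]; should be close to 1/n
mse = np.mean((y - xbar) ** 2)
print(mse, 1.0 / n)
```

Note that the uniform prior on $\mu$ drops out entirely, exactly as the total-expectation step predicts: the conditional risk $1/n$ does not depend on $y$.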