Estimate of Expectation of Estimated Function


Let $X_1,X_2,\dots$ be a sequence of i.i.d. random variables distributed uniformly on $[0,1]$. By the law of large numbers, for any integrable function $f=f(x)$ we have convergence in probability to the expectation:
$$ \frac{1}{M}\sum_{m=1}^{M} f(X_m) \overset{p}{\underset{M\to\infty}{\longrightarrow}} \int_0^1 f(x)\, dx\, . $$
From this viewpoint, the left-hand sum with $X_1,X_2,\dots$ replaced by their realizations $x_1,x_2,\dots$ can be treated as a Monte Carlo approximation of the right-hand integral:
$$ \frac{1}{M}\sum_{m=1}^{M} f(x_m) \approx \int_0^1 f(x)\, dx\, . \tag{1}\label{eq1} $$
Now assume that for any $x\in[0,1]$ the exact value of $f(x)$ cannot be computed directly but can be estimated statistically. More specifically, suppose there are integrable functions $f_N=f_N(y_1,\dots,y_N, x)$, $N=1,2,\dots$, all bounded by a common constant, such that
$$ f_N(Y_1,\dots,Y_N, x) \overset{p}{\underset{N\to\infty}{\longrightarrow}} f(x)\, , $$
where $Y_1,Y_2,\dots$ is another sequence of i.i.d. random variables distributed uniformly on $[0,1]$, independent of $X_1,X_2,\dots$. Thus we have
$$ \frac{1}{M}\sum_{m=1}^{M} f_N(Y_1,\dots,Y_N, X_m) \overset{p}{\underset{N\to\infty}{\longrightarrow}}\frac{1}{M}\sum_{m=1}^{M} f(X_m) \overset{p}{\underset{M\to\infty}{\longrightarrow}} \int_0^1 f(x)\, dx\, . $$
So the question is:

Is it legitimate to reuse the realizations $x_1,x_2,\dots$ of $X_1,X_2,\dots$ in place of a sample of $Y_1,Y_2,\dots$ to estimate the values $f(x_1),f(x_2),\dots$ in \eqref{eq1}? It seems that we would need
$$ \frac{1}{M}\sum_{m=1}^{M} f_M(X_1,\dots,X_M, X_m) \overset{p}{\underset{M\to\infty}{\longrightarrow}} \int_0^1 f(x)\, dx\, , $$
but I cannot find a formulation of the law of large numbers that justifies such a limit.
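To make the setting concrete, here is a small numerical sketch of the nested estimator with an *independent* inner sample, for the hypothetical toy choice $g(x,y)=\mathbf{1}\{y\le x\}$ (my own illustration, not part of the question), so that $f(x)=\mathbb{E}[g(x,Y)]=x$ and $\int_0^1 f(x)\,dx = 1/2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy example: g(x, y) = 1{y <= x}, so that
# f(x) = E[g(x, Y)] = P(Y <= x) = x  and  the target integral is 1/2.
def f_N(y, x):
    """Inner Monte Carlo estimate f_N(y_1, ..., y_N, x) of f(x)."""
    return np.mean(y <= x)

M, N = 2000, 2000
x = rng.uniform(size=M)  # outer sample X_1, ..., X_M
y = rng.uniform(size=N)  # independent inner sample Y_1, ..., Y_N

# (1/M) * sum_m f_N(Y_1, ..., Y_N, X_m)  ≈  ∫_0^1 f(x) dx
estimate = np.mean([f_N(y, xm) for xm in x])
print(estimate)  # close to 1/2 for large M, N
```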
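For what it is worth, in the same toy case ($g(x,y)=\mathbf{1}\{y\le x\}$, again a hypothetical illustration rather than a proof) the reused-sample estimator of the question also converges, since it reduces to an average of ranks:

```python
import numpy as np

rng = np.random.default_rng(1)

# Same toy example: g(x, y) = 1{y <= x}, f(x) = x, ∫_0^1 f(x) dx = 1/2.
M = 2000
x = rng.uniform(size=M)  # one sample, reused both as the X's and the Y's

# (1/M) sum_m f_M(X_1, ..., X_M, X_m) = (1/M^2) sum_{m,k} 1{X_k <= X_m};
# the inner count is the rank of X_m, so the average equals (M + 1) / (2M)
# almost surely (the X_m are distinct with probability one).
reused = np.mean([np.mean(x <= xm) for xm in x])
print(reused)
```

This is of course specific to one $g$; it only suggests empirically that the limit asked about can hold, not why it holds in general.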