Regressing random functions


In the classical regression setting, we have a target function $f^*$ that we seek to estimate with an estimator $\hat{f}_D$ fitted to the data $D$. There are many ways to generate $\hat{f}_D$ in different regression paradigms, with differing functional forms, but all of them treat $f^*$ as fixed. Most commonly, $\hat{f}_D$ is chosen to minimize MSE: for a training set $D$ drawn i.i.d. from $p(x)$ and a test point $x_* \sim p(x)$ from the same distribution, we want $\hat{f} = \arg\min_{\hat{f}} E_{D, x_*}[(f^*(x_*) - \hat{f}_D(x_*))^2]$.
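To make the objective concrete, here is a minimal sketch that estimates $E_{D, x_*}[(f^*(x_*) - \hat{f}_D(x_*))^2]$ by Monte Carlo for one particular (and entirely arbitrary, my own choice) fixed target $f^*(x) = \sin(3x)$ and one particular fitting procedure (degree-3 least-squares polynomial fit):

```python
import numpy as np

rng = np.random.default_rng(0)

def f_star(x):
    # A fixed (non-random) target function, chosen arbitrarily for illustration.
    return np.sin(3 * x)

n_trials, n_train, noise = 2000, 30, 0.1
errs = []
for _ in range(n_trials):
    X = rng.uniform(-1, 1, n_train)                 # training inputs, D ~ p(x)
    y = f_star(X) + noise * rng.normal(size=n_train)
    coefs = np.polyfit(X, y, deg=3)                 # fit \hat{f}_D by least squares
    x_test = rng.uniform(-1, 1)                     # test point x_* ~ p(x)
    errs.append((f_star(x_test) - np.polyval(coefs, x_test)) ** 2)

# Monte Carlo estimate of E_{D, x_*}[(f*(x_*) - \hat{f}_D(x_*))^2]
print(np.mean(errs))
```

The expectation is over both the draw of the training set and the test point, but $f^*$ itself never changes between trials.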

My question is: what if we instead treat the target function $f^*$ as random, drawn from some distribution $g(f)$? What is the optimal estimator $\hat{f} = \arg\min_{\hat{f}} E_{f^*, D, x_*}[(f^*(x_*) - \hat{f}_D(x_*))^2]$ in that case, where the expectation is now also taken over $g(f)$? Does any work/literature exist on this topic? What is known about this in general?
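To illustrate what changes when the outer expectation includes $f^*$, here is a sketch under one assumed (my own, not from the question) form of $g(f)$: a Gaussian prior over linear-in-features functions, $f^*(x) = w^\top \phi(x)$ with $w \sim N(0, \tau^2 I)$. Each trial draws a fresh $f^*$, and two estimators are compared: plain least squares, and the posterior mean under the prior (which coincides with ridge regression with penalty $\sigma^2/\tau^2$). Averaged over $g(f)$, the estimator that uses the prior should do no worse:

```python
import numpy as np

rng = np.random.default_rng(1)

def phi(x):
    # Assumed feature map (my choice): constant, linear, quadratic features.
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

tau2, sigma2 = 0.25, 1.0          # prior variance of w, observation noise variance
n_trials, n_train = 5000, 8
err_ls, err_bayes = [], []
for _ in range(n_trials):
    w = rng.normal(0, np.sqrt(tau2), 3)              # draw f* ~ g(f)
    X = rng.uniform(-1, 1, n_train)                  # training inputs, D ~ p(x)
    Phi = phi(X)
    y = Phi @ w + rng.normal(0, np.sqrt(sigma2), n_train)

    w_ls = np.linalg.lstsq(Phi, y, rcond=None)[0]    # plain least squares
    A = Phi.T @ Phi + (sigma2 / tau2) * np.eye(3)
    w_bayes = np.linalg.solve(A, Phi.T @ y)          # posterior mean (= ridge)

    x_t = rng.uniform(-1, 1)                         # test point x_* ~ p(x)
    pt = phi(np.array([x_t]))[0]
    err_ls.append((pt @ w - pt @ w_ls) ** 2)
    err_bayes.append((pt @ w - pt @ w_bayes) ** 2)

# Monte Carlo estimates of E_{f*, D, x_*}[(f*(x_*) - \hat{f}_D(x_*))^2]
print(np.mean(err_ls), np.mean(err_bayes))
```

This is only a sketch for one assumed $g(f)$, not an answer to the general question; it is meant to show that once the expectation is taken over $g(f)$ as well, an estimator built from that distribution can beat one that ignores it.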