Optimisation problem for functional linear regression


Suppose that $Y$ is a zero mean random variable with values in $\mathbb R$ such that $\operatorname E|Y|^2<\infty$ and $X$ is a zero mean random element with values in $L^2[0,1]$ such that $\operatorname E\|X\|^2<\infty$, where $\|\cdot\|$ is the $L^2[0,1]$ norm. Both $Y$ and $X$ are defined on the same probability space $(\Omega,\mathcal F,P)$. Consider the problem of finding $\beta^*\in L^2[0,1]$ that minimises $$ R(\beta) =\operatorname E|Y-\langle\beta,X\rangle|^2, $$ where $\langle\cdot,\cdot\rangle$ is the $L^2[0,1]$ inner product.
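Expanding the square seems to give a form that mirrors the multivariate one (a sketch; here $c(t,s)=\operatorname E[X(t)X(s)]$ denotes the covariance kernel, notation I am introducing myself): $$ R(\beta)=\operatorname EY^2-2\langle\beta,C_{XY}\rangle+\langle C_X\beta,\beta\rangle, $$ where $C_{XY}=\operatorname E[YX]\in L^2[0,1]$ and $C_X$ is the covariance operator $$ (C_X\beta)(t)=\int_0^1 c(t,s)\beta(s)\,ds. $$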

Does such $\beta^*$ exist? Is such $\beta^*$ unique? What kind of conditions does $\beta^*$ need to satisfy? Are these conditions analogous to the multivariate case?

In the multivariate case, i.e. when $X$ is a random vector with values in $\mathbb R^p$, we have that $$ R(\beta)=\operatorname E|Y-\beta'X|^2 =\operatorname EY^2-2\beta'C_{XY}+\beta'C_X\beta, $$ where $X=(X_1,\ldots,X_p)'$, $\beta=(\beta_1,\ldots,\beta_p)'$, $C_{XY}=\operatorname E[XY]$, and $C_X=\operatorname E[XX']$. Calculating the gradient $\nabla R$ and setting it equal to zero, we arrive at the condition $$ C_X\beta=C_{XY}. $$ If $C_X$ is invertible, we have that $\beta^*=C_X^{-1}C_{XY}$. I am trying to understand whether the same line of thought could be used in the infinite-dimensional case to arrive at an analogous condition $C_X\beta=C_{XY}$ (note that the operator $C_X$ would not be invertible in the infinite-dimensional case).
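In the multivariate case, the condition $C_X\beta=C_{XY}$ is easy to check numerically. Here is a quick sketch (the data-generating setup, with a hypothetical $\beta_{\text{true}}$ and Gaussian noise, is my own choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 200_000

# Hypothetical setup: zero-mean X, and Y = <beta_true, X> + noise.
beta_true = np.arange(1.0, p + 1.0)
X = rng.standard_normal((n, p))
Y = X @ beta_true + rng.standard_normal(n)

# Sample analogues of C_X = E[XX'] and C_XY = E[XY].
C_X = X.T @ X / n
C_XY = X.T @ Y / n

# Solve the normal-equation condition C_X beta = C_XY.
beta_star = np.linalg.solve(C_X, C_XY)

print(beta_star)  # should be close to beta_true
```

With a large sample, `beta_star` recovers `beta_true` up to sampling error, which is consistent with $\beta^*=C_X^{-1}C_{XY}$ in the invertible case.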

This is related to functional linear regression and is given as Exercise 4.6 in Kokoszka and Reimherr, *Introduction to Functional Data Analysis* (2017).

Any help is much appreciated!