Linear least squares involving linear functionals


Let $\Omega\subset\mathbb{R}^d$ be a domain. Suppose $H$ is the second-order Fréchet derivative of a functional $f$ on $L^2(\Omega)$, so that $H$ is a bounded bilinear form, $H\in (L^2\times L^2)^{*}$. Let $g\in (L^2(\Omega))^{*}$, and suppose we want to solve for $v\in L^2(\Omega)$ such that $$ Hv=g, $$ where $Hv$ denotes the functional $H(v,\cdot)\in (L^2(\Omega))^{*}$.

I am wondering whether the linear least-squares algorithm can be applied here, i.e. whether we can update $v$ via gradient descent $$ v \leftarrow v-\alpha(H^{*}Hv-H^{*}g), $$ where $H^{*}$ is the adjoint of $H$ (identifying $(L^2(\Omega))^{*}$ with $L^2(\Omega)$ via the Riesz representation, so that $H^{*}Hv-H^{*}g$ is again an element of $L^2(\Omega)$).

I am seeking opinions on whether this will work. Concretely, we are optimising the objective $$ \min_{v\in L^2(\Omega)}\frac{1}{2}\|Hv-g\|^2_{(L^2(\Omega))^{*}} = \min_{v\in L^2(\Omega)}\frac{1}{2}\left(\sup_{\|w\|_{L^2}=1}\bigl(H(v,w)-g(w)\bigr)\right)^2. $$ Does the gradient of this objective look like the one we get in finite-dimensional linear least squares?
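For intuition, here is what the update looks like after a finite-dimensional discretisation, where the bilinear form $H$ becomes a symmetric matrix (a discrete Hessian) and the functional $g$ a vector. This is only a sketch under illustrative assumptions: the matrix, right-hand side, step size, and iteration count below are all made up for the example, not taken from the question.

```python
import numpy as np

# Hypothetical discretisation: pick a finite basis of L^2(Omega), so the
# bilinear form H becomes an n x n matrix and the functional g a vector
# (identified with an L^2 element via the Riesz representation).
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
H = A.T @ A + n * np.eye(n)   # symmetric positive definite, as the Hessian of a convex f would be
v_true = rng.standard_normal(n)
g = H @ v_true                # manufactured target so the exact solution is known

# Linear least-squares gradient descent on (1/2)||H v - g||^2:
#   v <- v - alpha * (H^T H v - H^T g)
# Stable for alpha < 2 / ||H||_2^2 (i.e. below 2 / lambda_max(H^T H)).
alpha = 1.0 / np.linalg.norm(H, 2) ** 2
v = np.zeros(n)
for _ in range(10_000):
    v -= alpha * (H.T @ (H @ v) - H.T @ g)

residual = np.linalg.norm(H @ v - g)
```

Since the discretised $H$ here is invertible, the iteration drives the residual $\|Hv-g\|$ to zero and $v$ recovers `v_true`; whether the infinite-dimensional iteration behaves this well depends on the spectrum of $H$ and on which norm one actually measures the residual in, which is the heart of the question.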