Orthogonality in Sobolev $H^1$.


Consider the Sobolev space $H^1(-1,1)$ with its natural inner product: $\langle f, g\rangle_{H^1} = \langle f, g\rangle_{L^2} + \langle f', g'\rangle_{L^2}$.
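For concreteness, the inner product can be evaluated numerically. A minimal sketch (our own, not part of the question) for $f(x) = x$ paired with itself:

```python
import numpy as np

# Concrete illustration (our own sketch): the H^1 inner product of
# f(x) = x with itself on (-1, 1) is
#   int_{-1}^1 x^2 dx + int_{-1}^1 1 dx = 2/3 + 2 = 8/3.

x = np.linspace(-1.0, 1.0, 200_001)

def trapz(y):
    """Plain trapezoidal rule on the fixed grid x."""
    return float(np.sum((y[1:] + y[:-1]) * (x[1] - x[0]) / 2.0))

f, fp = x, np.ones_like(x)           # f(x) = x, f'(x) = 1
val = trapz(f * f) + trapz(fp * fp)  # <f, f>_{H^1}
print(val)                           # ~ 8/3 = 2.6667
```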

Let $U$ be the subspace of functions $f$ with $f(x) = 0$ for all $x \le 0$. We want to find $U^\perp$.

The first guess is $U^\perp = \{f: f(x) = 0 \text{ for } x > 0\}$. But I suspect the same trick (considering a neighborhood of zero) that works for $C[-1,1]$ with $\langle f, g\rangle = \int_{-1}^1 f(x) g(x)\, dx$ does not apply here.

This family of functions is clearly orthogonal to $U$, but maybe there is something else? Here is what I tried:

$$ \int_{0}^{t} f g + \int_{0}^{t} f'g' = \int_0^t f g + f(t)g'(t) - \int_0^t f(x) g''(x)\, dx = 0 $$ (the boundary term at $0$ vanishes since $f(0) = 0$). Differentiating with respect to $t$: $$ f(t) g(t) + f'(t) g'(t) + f(t) g''(t) - f(t)g''(t) = 0, $$ that is, $f(t) g(t) + f'(t) g'(t) = 0$. Hence we obtain something like $f(t) = \bar{C}\exp\left(-\displaystyle\int \dfrac{g}{g'}\, dt\right)$. But here we assume that $g''$ exists (which might not be true).

Any ideas?

There are 2 answers below.

Answer 1.

By definition, $f$ is in the orthogonal complement of $U$ iff for every $\varphi\in U$ $$ \int_0^1 f\,\varphi + \int_0^1 f'\,\varphi' = 0. $$ If $f'$ is continuous and $f'(1)=0$, then $$ \int_0^1 (f-f'')\,\varphi = f'(0)\,\varphi(0)-f'(1)\,\varphi(1) = 0, $$ since $\varphi\in H^1 \implies \varphi\in C^0$ and so $\varphi\in U\implies \varphi(0)=0$. This means that $$ f - f'' = 0 \ \text{ on } [0,1] $$ in the weak sense. Solutions of this equation are of the form $f(t) = a\,e^t+b\,e^{-t}$ on $[0,1]$, and they lie in $H^1(-1,1)$ iff they are continuous at $t=0$ and $f'\in L^2(-1,1)$. Conversely, testing against a $\varphi \in U$ with $\varphi(1) \neq 0$ forces $f'(1) = 0$, i.e. $b = a\,e^2$. So I think $$ U^{\perp} = \{f\in H^1(-1,1):\exists(a,b)\in\Bbb R^2,\ b = a\,e^2,\ \forall t\ge 0,\ f(t) = a\,e^t + b\,e^{-t}\}, $$ but one should work more precisely with the boundary conditions to have a rigorous proof.
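The boundary-term argument can be checked numerically. In this sketch of ours (not part of the answer), for $f(t) = a e^t + b e^{-t}$, which solves $f - f'' = 0$, integration by parts gives $\langle f, \varphi\rangle_{H^1} = f'(1)\varphi(1)$ for every $\varphi \in U$, so orthogonality to all of $U$ requires $f'(1) = 0$:

```python
import numpy as np

# Sanity check (our own code): for f(t) = a e^t + b e^{-t} and phi in U,
# <f, phi>_{H^1} equals f'(1) phi(1); it is nonzero unless b = a e^2.

x = np.linspace(0.0, 1.0, 100_001)  # phi in U vanishes on [-1, 0], so [0, 1] suffices

def trapz(y):
    """Plain trapezoidal rule on the fixed grid x."""
    return float(np.sum((y[1:] + y[:-1]) * (x[1] - x[0]) / 2.0))

a, b = 2.0, 3.0                     # deliberately NOT satisfying b = a e^2
f  = a * np.exp(x) + b * np.exp(-x)
fp = a * np.exp(x) - b * np.exp(-x)
phi, phip = x, np.ones_like(x)      # phi(x) = x for x > 0, 0 otherwise, lies in U

lhs = trapz(f * phi) + trapz(fp * phip)  # <f, phi>_{H^1}
rhs = a * np.e - b / np.e                # f'(1) * phi(1)
print(lhs, rhs)                          # equal, and nonzero since b != a e^2
```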

Answer 2.

We claim that $U^\perp$ consists of the functions $g \in H^1$ such that for some $\alpha \in \mathbb{R}$ we have $$ g(x) = \alpha\left(e^x + e^{2-x}\right), \quad x > 0. $$

Indeed, let $g \in H^1$ be such that $\langle f, g \rangle_{H^1} = 0$ for every $f \in U.$ For $s \in \left]0, 1\right]$, consider the function $f_s$ defined by $$ f_s(x) = \begin{cases} 0 &\text{if } x \leqslant 0, \\ x / s &\text{if } 0 \leqslant x \leqslant s, \\ 1 &\text{if } s \leqslant x \leqslant 1. \end{cases} $$ Then clearly $f_s \in U$, and since $\langle f_s, g \rangle_{H^1} = 0$ we have $$ \frac{1}{s}\int_0^s x g(x)\, \mathrm{d}x + \int_s^1 g(x)\, \mathrm{d}x + \frac{1}{s}\int_0^s g'(x)\, \mathrm{d}x = 0. $$ Multiplying by $s$ and differentiating with respect to $s$ (the $s\,g(s)$ terms cancel) we get $$ \int_s^1 g(x)\, \mathrm{d}x + g'(s) = 0, \quad s > 0. \quad \quad (*) $$ By $(*)$, $g'$ is the integral of a continuous function, so $g$ is of class $C^2$, and differentiating $(*)$ gives $g'' - g = 0$. Thus, for some $\alpha, \beta \in \mathbb{R}$, we have $g(s) = \alpha \exp(s) + \beta \exp(-s)$ for every $s \in \left]0, 1\right]$. Taking $s = 1$ in $(*)$ gives $g'(1) = 0$, hence $\alpha e - \beta /e = 0$, i.e. $\beta = \alpha e^2$. Thus we obtain that for some $\alpha \in \mathbb{R}$ one has $$ g(s) = \alpha\left(e^s + e^{2-s}\right), \quad s > 0. \quad \quad (**) $$
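As a numerical sanity check (our own sketch, not part of the answer), the tent functions $f_s$ indeed pair to zero with the claimed $g(x) = e^x + e^{2-x}$, i.e. $\frac{1}{s}\int_0^s x g + \int_s^1 g + \frac{1}{s}\int_0^s g' = 0$ for every $s \in \left]0, 1\right]$:

```python
import numpy as np

# Check <f_s, g>_{H^1} = 0 for the tent functions f_s and the claimed
# orthogonal element g(x) = e^x + e^{2-x} (alpha = 1).

def inner_fs_g(s, n=100_000):
    g = lambda t: np.exp(t) + np.exp(2.0 - t)
    trapz = lambda y, t: float(np.sum((y[1:] + y[:-1]) * np.diff(t) / 2.0))
    x1 = np.linspace(0.0, s, n + 1)        # ramp part of f_s
    x2 = np.linspace(s, 1.0, n + 1)        # flat part of f_s
    term1 = trapz((x1 / s) * g(x1), x1)    # (1/s) int_0^s x g
    term2 = trapz(g(x2), x2)               # int_s^1 g
    term3 = (g(s) - g(0.0)) / s            # (1/s) int_0^s g' (exact)
    return term1 + term2 + term3

vals = [inner_fs_g(s) for s in (0.1, 0.5, 1.0)]
print(vals)   # all close to 0
```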

Conversely, integration by parts shows that if $g$ satisfies $(**)$, then $g \in U^\perp$. This concludes the proof of the claim.
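The converse direction can also be probed numerically. In this sketch of ours, with $g(x) = e^x + e^{2-x}$ on $\left]0, 1\right]$, the pairing $\langle f, g\rangle_{H^1}$ vanishes for several sample $f \in U$ (each $f$ has $f(0) = 0$ and is extended by $0$ to $[-1, 0]$):

```python
import numpy as np

# Check <f, g>_{H^1} = 0 for a few f in U and g(x) = e^x + e^{2-x};
# the integrals over [-1, 0] vanish since f = f' = 0 there.

x  = np.linspace(0.0, 1.0, 100_001)
g  = np.exp(x) + np.exp(2.0 - x)
gp = np.exp(x) - np.exp(2.0 - x)

def trapz(y):
    """Plain trapezoidal rule on the fixed grid x."""
    return float(np.sum((y[1:] + y[:-1]) * (x[1] - x[0]) / 2.0))

samples = [                      # pairs (f, f') on [0, 1] with f(0) = 0
    (x, np.ones_like(x)),
    (np.sin(3.0 * x), 3.0 * np.cos(3.0 * x)),
    (x**2, 2.0 * x),
]
residuals = [trapz(f * g) + trapz(fp * gp) for f, fp in samples]
print(residuals)   # all close to 0
```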