Let $-\infty<a<b<\infty$ and $f,g\in H^1(a,b)$. So, $f,f',g,g'\in L^2(a,b)$. Suppose
$$\int_a^b|f+g'|^2\mathrm dx=\int_a^b|f'-g|^2\mathrm dx=0.$$
Is it possible to conclude that $f=g=0$ a.e.? If not, could you give me a counterexample?
Thanks.
No. Take $f(t)=\sin t$, $g(t)=\cos t$. Then $f+g'=\sin t-\sin t=0$ and $f'-g=\cos t-\cos t=0$ identically, so both integrals vanish on any interval $(a,b)$, yet neither $f$ nor $g$ is zero a.e.
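As a sanity check, here is a small numerical sketch (the interval $(0,1)$ and the grid size are arbitrary choices) confirming that both integrals vanish for this pair:

```python
import numpy as np

a, b = 0.0, 1.0  # any finite interval works
t = np.linspace(a, b, 10001)

f, fp = np.sin(t), np.cos(t)   # f = sin t,  f' = cos t
g, gp = np.cos(t), -np.sin(t)  # g = cos t,  g' = -sin t

I1 = np.trapz(np.abs(f + gp) ** 2, t)  # ∫ |f + g'|^2 dt
I2 = np.trapz(np.abs(fp - g) ** 2, t)  # ∫ |f' - g|^2 dt
print(I1, I2)  # both integrands are identically zero
```

Since the integrands are pointwise zero, both quadratures return $0$ up to floating-point error.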