A question related to Wave Equation


Let $L>0$. Suppose $f, g$ are $C^2$ functions on $\mathbb{R}$ such that $$f(t)+f(-t)+\int_{-t}^t g(s)\,ds=0$$ and $$f(L+t)+f(L-t)+\int_{L-t}^{L+t} g(s)\,ds=0$$ for all $t\in \mathbb{R}.$ Does it follow that $f, g$ are odd periodic functions of period $2L$?


There are 3 answers below.

BEST ANSWER

Lemma. If $o(t)$ is odd and $e(t)$ is even and $o(t)=e(t)$ for all $t$, then $o(t)=e(t)=0$ for all $t$.

To prove this, substitute $-t$ for $t$ in the equation: since $o(-t)=-o(t)$ and $e(-t)=e(t)$, we get $-o(t)=e(t)$. Combined with $o(t)=e(t)$, this gives $-o(t)=o(t)$, hence $o(t)=e(t)=0$ for all $t$.

Now in our first equation, $f(t)+f(-t)$ is even as a function of $t$ and $-\int_{-t}^t g(s)\,ds$ is odd. So by the Lemma we have $f(t)+f(-t)=0$ for all $t$, i.e. $f$ is odd.

In the second equation, $f(L-t)+f(L+t)$ is even and $-\int_{L-t}^{L+t} g(s)\,ds$ is odd, so by the Lemma, $f(L-t)=-f(L+t)$. Substituting $L+t$ for $t$ gives $f(-t)=-f(2L+t)$. Since we have already shown that $f(-t)=-f(t)$, it follows that $f(t)=f(2L+t)$, i.e. $f$ has period $2L$.

Now the first equation reduces to $\int_{-t}^t g(s)\,ds=0$; differentiating in $t$ gives $g(t)+g(-t)=0$, i.e. $g$ is odd. The second equation reduces to $\int_{L-t}^{L+t} g(s)\,ds=0$; differentiating gives $g(L+t)+g(L-t)=0$. Substituting $L+t$ for $t$ and using that $g$ is odd, exactly as for $f$, yields $g(t)=g(2L+t)$, so $g$ has period $2L$.
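As a quick numerical sanity check of these conclusions (not part of the proof), here is a Python sketch. The concrete choice $f(t)=g(t)=\sin(\pi t/L)$ and the value of $L$ are assumptions for illustration only; this pair satisfies both hypotheses, and the script confirms it is odd with period $2L$.

```python
import math

L = 1.5  # arbitrary L > 0 (illustrative choice)

def f(t):  # f(t) = sin(pi t / L): an odd, 2L-periodic candidate
    return math.sin(math.pi * t / L)

def g(s):  # same profile for g
    return math.sin(math.pi * s / L)

def integral(func, a, b, n=2000):
    # composite Simpson's rule (n must be even)
    h = (b - a) / n
    total = func(a) + func(b)
    for k in range(1, n):
        total += (4 if k % 2 else 2) * func(a + k * h)
    return total * h / 3

for t in [0.0, 0.3, 1.0, 2.7, -1.2]:
    c1 = f(t) + f(-t) + integral(g, -t, t)
    c2 = f(L + t) + f(L - t) + integral(g, L - t, L + t)
    assert abs(c1) < 1e-9 and abs(c2) < 1e-9  # both hypotheses hold
    assert abs(f(-t) + f(t)) < 1e-9           # f is odd
    assert abs(f(t + 2 * L) - f(t)) < 1e-9    # f has period 2L
    assert abs(g(-t) + g(t)) < 1e-9           # g is odd
    assert abs(g(t + 2 * L) - g(t)) < 1e-9    # g has period 2L
```

Of course this only checks one example; the argument above shows that *every* $C^2$ pair satisfying the two equations must be odd and $2L$-periodic.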

ANSWER

I think so.

What you've written is d'Alembert's formula for the wave equation in one dimension on the line, with the conditions $u(0,t) = u_t(0,t) = u(L,t) = u_t(L,t) = 0$ imposed for all times $t$.

The generic Fourier mode for this equation is $e^{i\xi (x -ct)}$, and the Fourier inversion formula (in the sense of tempered distributions if needed) represents the solution as an integral over all frequencies $\xi$. But because of your imposed conditions, no mode can contribute except those with $\xi = \frac{n\pi}{L}$, and indeed only the sine portion of each such term. This essentially gives the conclusion you sought for $f$ and $g$.

I gather from the negative vote that I was too vague. I will try again with a less hand-wavy method, without reference to the Fourier theory.

You can consider that, in general, on the line the solution of the wave equation is the superposition of a wave travelling to the right and a wave travelling to the left. So if we set $c=1$, we have $$u(x,t) = F(x+t) + G(x-t) = \frac{1}{2} \left(f(x+t) + f(x-t) + \int_{x-t}^{x+t} g(s)\,ds \right)$$ where $f(x) = F(x) + G(x)$ and $g(x) = F'(x) - G'(x)$ (for the details of this transformation, see any textbook on elementary PDE, or the Wikipedia article on d'Alembert's formula).

The conditions translate to a statement that $F(t) + G(-t)= 0$, and that $F(L+t) + G(L-t) = 0$. Plugging the first equality into the second, we get $F(L+t) = F(-L+t)$, which is precisely the statement that $F$ is periodic of period $2L$. If we perform the substitution with respect to $G$ instead, we get that $G(L-t) = G(-L-t)$ for all values $t$, so $G$ is periodic of period $2L$ as well. Since $F$ and $G$ are $2L$ periodic, so are $f$ and $g$.

To see that $f$ and $g$ are odd, take a look at $$f(-x) = F(-x) + G(-x) = -G(x) + (-F(x)) = - f(x).$$ This uses only the first condition above, in the form $F(-x)=-G(x)$ and $G(-x)=-F(x)$. Similarly, differentiate the first condition to get $F'(t) - G'(-t) = 0$, and then substitute to get $$g(-x) = F'(-x) - G'(-x) = G'(x) - F'(x) = - g(x).$$
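The travelling-wave argument can also be sanity-checked numerically. In the sketch below, the profile $F$ and the value of $L$ are illustrative assumptions: $F$ is taken to be any $2L$-periodic function, $G$ is then forced by the boundary condition $F(t)+G(-t)=0$, and the script verifies that the resulting $f=F+G$ and $g=F'-G'$ come out odd and $2L$-periodic, as claimed.

```python
import math

L = 2.0  # arbitrary L > 0 (illustrative choice)

def F(t):  # an arbitrary 2L-periodic right-mover profile (example choice)
    return math.sin(math.pi * t / L) + 0.3 * math.cos(math.pi * t / L)

def G(t):  # forced by the boundary condition F(t) + G(-t) = 0
    return -F(-t)

def d(func, t, h=1e-6):
    # central-difference approximation of the derivative
    return (func(t + h) - func(t - h)) / (2 * h)

def f(x):  # f = F + G
    return F(x) + G(x)

def g(x):  # g = F' - G'
    return d(F, x) - d(G, x)

for t in [0.0, 0.4, 1.1, 3.6, -2.2]:
    assert abs(F(t) + G(-t)) < 1e-9         # condition at x = 0
    assert abs(F(L + t) + G(L - t)) < 1e-9  # condition at x = L
    assert abs(f(-t) + f(t)) < 1e-9         # f is odd
    assert abs(f(t + 2 * L) - f(t)) < 1e-9  # f has period 2L
    assert abs(g(-t) + g(t)) < 1e-5         # g is odd (finite-diff tolerance)
    assert abs(g(t + 2 * L) - g(t)) < 1e-5  # g has period 2L
```

The looser tolerances on $g$ account for the finite-difference approximation of $F'$ and $G'$.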

ANSWER

There is a degenerate example worth noting.

When $f(t)=g(t)=0$, the equations $f(t)+f(-t)+\int_{-t}^t g(s)\,ds=0$ and $f(L+t)+f(L-t)+\int_{L-t}^{L+t} g(s)\,ds=0$ are obviously satisfied, but $f$ and $g$ are then odd and $2L$-periodic only in the trivial sense.