Differentiation with integration region depending on $t$, to show decreasing energy for the wave equation


I want to show that for the general wave equation $u_{tt} - \nabla \cdot (c^2\nabla u) + qu = 0, \quad u(x, 0) = \phi(x), \quad u_t(x, 0) = \psi(x)$, the energy

$$ E(t) = \int_{|x-x_0| < R_0 - c_2t} (u_t^2 + c(x)^2|\nabla u|^2 + q(x)u^2) \, dx $$ is non-increasing whenever $q(x) \geq 0$ and $c_1 \leq c(x) \leq c_2$ for some constants $c_1, c_2 > 0$.
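(For context, the standard energy-method identity behind such estimates comes from multiplying the PDE by $u_t$; assuming, as the problem statement suggests, that $c$ and $q$ depend on $x$ only, one gets
$$ \frac{\partial}{\partial t}\,\frac{1}{2}\left(u_t^2 + c(x)^2|\nabla u|^2 + q(x)u^2\right) = \nabla \cdot \left(c(x)^2\, u_t \nabla u\right), $$
since $u_t u_{tt} + q u u_t = u_t\,\nabla\cdot(c^2\nabla u) = \nabla\cdot(c^2 u_t\nabla u) - c^2\,\nabla u_t\cdot\nabla u$. Integrating this over the shrinking ball is where the $t$-dependent region enters.)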

For $x \in \mathbb{R}$, I know by Leibniz's Theorem that $$\frac{d}{dt} \int_{a(t)}^{b(t)} f(x, t) \, dx = \int_{a(t)}^{b(t)} \frac{\partial f(x, t)}{\partial t} \, dx + b'(t)f(b(t), t) - a'(t)f(a(t), t)$$ from which I am able to prove the desired result in one dimension. Is there an analogous version of this theorem in higher dimensions, which would let me differentiate an integral whose region of integration is defined as a function of $t$?
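A sketch of the higher-dimensional analogue (the Reynolds transport theorem): for a sufficiently regular region $\Omega(t) \subset \mathbb{R}^n$ whose boundary moves with outward normal velocity $v_n$,
$$ \frac{d}{dt} \int_{\Omega(t)} f(x, t) \, dx = \int_{\Omega(t)} \frac{\partial f(x, t)}{\partial t} \, dx + \int_{\partial\Omega(t)} f(x, t)\, v_n \, dS. $$
In one dimension with $\Omega(t) = (a(t), b(t))$ this recovers Leibniz's rule, since $v_n = b'(t)$ at the right endpoint and $v_n = -a'(t)$ at the left. For the shrinking ball $\Omega(t) = \{|x - x_0| < R_0 - c_2 t\}$ the boundary moves inward at speed $c_2$, so $v_n = -c_2$ and the boundary contribution is $-c_2 \int_{\partial\Omega(t)} f \, dS$, which is $\leq 0$ whenever the integrand $f$ is nonnegative.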