Is the integral over a convex mapping of the solution to the heat equation monotonic?


I have the following problem:

Suppose $u = u(t,x)$ solves the Cauchy problem for the (one-dimensional) heat equation: $$\frac{\partial u}{\partial t} = \frac{\partial^2 u}{\partial x^2}$$ $$u(0,x) = g(x) $$ and satisfies $\int_{\mathbb{R}} u(t,x) \,\text{d}x = 0$ for all $t \geq 0$. Furthermore, for a convex function $\phi$ satisfying $\phi(0) = 0$, define: $$F(t) := \int_{\mathbb{R}} \phi (u(t,x)) \,\text{d}x $$ The goal is to show that $F$ is monotonic (I suppose here that it is decreasing).

What I have tried so far:

The key to this problem seems to be Jensen's inequality, since it relates integrals and convex functions. Therefore, define $T>t$ and apply the inequality:

$$ F(T) - F(t) \geq \phi\left(\int_{\mathbb{R}} u(T,x) \,\text{d}x\right) - F(t) = - F(t) $$

The minuend is zero by the assumptions on $u$ and $\phi$. I don't know how to proceed from here. I have also tried representing $u$ with the heat kernel, i.e.: $$ u(t,x) = \int_{\mathbb{R}} \frac{1}{\sqrt{4\pi t}} \exp{\left\{{\frac{-(x-y)^2}{4t}}\right\}} g(y) \,\text{d}y$$

but I am not able to make progress from there either. Any help is appreciated!
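As a sanity check (not a proof), the claimed decrease can be observed numerically from the heat-kernel formula above. The initial condition $g$ and the choice $\phi(u) = u^2$ below are arbitrary illustrative choices; $g$ is odd, so $\int_{\mathbb{R}} u(t,x)\,\text{d}x = 0$ for all $t$.

```python
import numpy as np

# Grid and a mean-zero initial condition g (odd, so the integral of u stays 0).
x = np.linspace(-20, 20, 1001)
dx = x[1] - x[0]
g = x * np.exp(-x**2)

def heat_solution(t):
    """Evaluate u(t, .) = Phi(t, .) * g via the heat-kernel formula."""
    diff = x[:, None] - x[None, :]                      # pairwise x - y
    kernel = np.exp(-diff**2 / (4*t)) / np.sqrt(4*np.pi*t)
    return kernel @ g * dx                              # Riemann sum over y

phi = lambda v: v**2                                    # convex with phi(0) = 0

F = [np.sum(phi(heat_solution(t))) * dx for t in (0.1, 0.5, 1.0, 2.0)]
print(F)  # the values decrease in t
```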


There are 2 best solutions below


Only a partial answer:

If $\phi$ is twice differentiable, you can use $\lim_{x\to\pm\infty}\partial_xu(t,x)=0$ and $\lim_{x\to\pm\infty}u(t,x)=0$ to show it by integration by parts. Differentiating under the integral and using $\partial_t u = \partial_x^2 u$: $$F'(t)=\int_{\Bbb R}\phi'(u(t,x))\,\partial_x^2u(t,x)\,\text{d}x=\bigg[\partial_xu(t,x)\,\phi'(u(t,x))\bigg]_{x=-\infty}^\infty-\int_{\Bbb R}\phi''(u(t,x))\big(\partial_xu(t,x)\big)^2\,\text{d}x\leq0, $$ since the boundary term vanishes and $\phi''\geq 0$.
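This identity can be checked numerically. The choices below are hypothetical: $u$ is the exact mean-zero solution $\partial_x \Phi(t+1,x)$ (the $x$-derivative of the heat kernel, shifted in time), and $\phi(u)=u^4$ so that $\phi''$ is nontrivial; a central difference of $F$ is compared with $-\int \phi''(u)(\partial_x u)^2\,\text{d}x$.

```python
import numpy as np

x = np.linspace(-30, 30, 6001)
dx = x[1] - x[0]

def u(t):
    # Exact mean-zero solution: x-derivative of the heat kernel at time t + 1.
    s = t + 1.0
    return -x / (2*s) * np.exp(-x**2 / (4*s)) / np.sqrt(4*np.pi*s)

phi   = lambda v: v**4          # convex, phi(0) = 0
ddphi = lambda v: 12 * v**2     # phi''

def F(t):
    return np.sum(phi(u(t))) * dx

t, h = 0.5, 1e-4
lhs = (F(t + h) - F(t - h)) / (2*h)         # F'(t) by central difference
ux  = np.gradient(u(t), dx)                 # approximate d/dx u(t, x)
rhs = -np.sum(ddphi(u(t)) * ux**2) * dx     # -∫ phi''(u) (u_x)^2 dx
print(lhs, rhs)                             # both negative, nearly equal
```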


So I have come to the following, which I think is a proof:

Firstly, I used the generalized solution formula for an arbitrary starting time $t_0$ (so not only $0$), with value $u(t_0,x) = u_{t_0}(x)$ at time $t_0$. For any time $t>t_0$ we get:

$$ u(t,x) = \int_{\mathbb{R}} u_{t_0}(y) \Phi(t-t_0, x-y) \,\text{d}y $$ where $\Phi(t,x)= \frac{1}{\sqrt{4\pi t}} \exp{\left\{{\frac{-x^2}{4t}}\right\}}$. This formula is the same as above for $t_0 = 0$. Crucially, however, we may use it to show the monotonicity, by treating the value at the earlier time as the starting value. Let $t>t_0$ be arbitrary, then we get: $$ \begin{align} F(t) = &\int_{\mathbb{R}} \phi(u(t,x)) \,\text{d}x \\= &\int_{\mathbb{R}} \phi\left(\int_{\mathbb{R}} u_{t_0}(x-y) \Phi(t-t_0,y) \,\text{d}y\right) \,\text{d}x \quad \text{(Convolution)}\\ \leq &\int_{\mathbb{R}} \int_{\mathbb{R}} \phi\left(u_{t_0}(x-y)\right) \Phi(t-t_0,y) \,\text{d}y\,\text{d}x \quad \text{(Jensen)}\\ = &\int_{\mathbb{R}} \int_{\mathbb{R}} \phi\left(u_{t_0}(y)\right) \Phi(t-t_0,x-y) \,\text{d}y\,\text{d}x \quad \text{(Convolution)}\\ = &\int_{\mathbb{R}} \phi\left(u_{t_0}(y)\right) \int_{\mathbb{R}} \Phi(t-t_0,x-y) \,\text{d}x\,\text{d}y \quad \text{(Fubini)}\\ = &\int_{\mathbb{R}} \phi\left(u_{t_0}(y)\right) \,\text{d}y = F(t_0) \end{align} $$

We may use Jensen in the second step, since $\Phi(t-t_0,y)$ integrates to $1$ over $y$; therefore, formally, $\Phi(t-t_0,y) \,\text{d}y$ is a probability measure.
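The Jensen step can also be seen pointwise in $x$: with $\phi = |\cdot|$, the inequality $\phi(\Phi * u_{t_0}) \leq \Phi * \phi(u_{t_0})$ holds at every point, and integrating over $x$ gives $F(t) \leq F(t_0)$. A small numerical check (the odd starting value $u_{t_0}$ below is an arbitrary illustrative choice):

```python
import numpy as np

x = np.linspace(-20, 20, 2001)
dx = x[1] - x[0]
u0  = np.sin(x) * np.exp(-x**2 / 4)   # mean-zero starting value u_{t0} (odd)
phi = np.abs                          # convex with phi(0) = 0

def kernel_convolve(f, t):
    """(Phi(t, .) * f)(x) on the grid, as a Riemann sum."""
    diff = x[:, None] - x[None, :]
    return (np.exp(-diff**2 / (4*t)) / np.sqrt(4*np.pi*t)) @ f * dx

t = 0.7                                      # plays the role of t - t0
lhs = phi(kernel_convolve(u0, t))            # phi of the average: phi(u(t, x))
rhs = kernel_convolve(phi(u0), t)            # average of phi
print(np.all(lhs <= rhs + 1e-12))            # Jensen, pointwise in x: True
print(np.sum(lhs) * dx <= np.sum(rhs) * dx)  # F(t) <= F(t0): True
```

Since $\int_{\mathbb{R}} \Phi(t,x)\,\text{d}x = 1$, the integral of `rhs` recovers $F(t_0)$ up to discretization error, exactly as in the Fubini step of the proof.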