Solving a heat equation with time-dependent boundary conditions


Problem statement

Solve the following PDE for $u(x,t)$, $0<x<\ell$, $t>0$: $$u_t=ku_{xx}+bu_x+cu,$$ with initial and boundary conditions $$u(x,0)=f(x),\qquad u(0,t)=g_0(t),\qquad u(\ell,t)=g_1(t).$$
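A useful preliminary observation (not part of the original statement): the substitution $u(x,t)=e^{\alpha x+\beta t}v(x,t)$, with suitably chosen constants $\alpha,\beta$, removes the lower-order terms and reduces the problem to the plain heat equation, which is the setting where the standard boundary-condition techniques apply directly. A short derivation:

```latex
% Substitute u = e^{\alpha x + \beta t} v into u_t = k u_{xx} + b u_x + c u:
%   \beta v + v_t = k(\alpha^2 v + 2\alpha v_x + v_{xx}) + b(\alpha v + v_x) + c v.
% Choosing the constants to cancel the v and v_x terms,
\[
  \alpha = -\frac{b}{2k}, \qquad \beta = c - \frac{b^2}{4k},
\]
% leaves the standard heat equation for v, with transformed data:
\[
  v_t = k\, v_{xx}, \qquad
  v(x,0) = e^{-\alpha x} f(x), \quad
  v(0,t) = e^{-\beta t} g_0(t), \quad
  v(\ell,t) = e^{-\alpha \ell - \beta t} g_1(t).
\]
```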

My attempt

Taking a Fourier transform, \begin{align*} \mathcal{F}(u_t)&=\mathcal{F}(ku_{xx}+bu_x+cu)=k\mathcal F(u_{xx})+b\mathcal F(u_x)+c\mathcal F(u)\\ \frac{d}{dt}\hat{u}(\omega,t)&=-k\omega^2 \hat{u}(\omega, t)+ib\omega \hat{u}(\omega,t)+c\hat{u}(\omega,t) \end{align*} This is an ODE in $t$ with the following general solution \begin{align*} \hat{u}(\omega, t)&=a(\omega)e^{(-k\omega^2+ib\omega+c)t} \end{align*} We know $\hat{u}(\omega, 0)=a(\omega)=\mathcal F(f(x))=\hat{f}(\omega)$. Now we use this and take an inverse Fourier transform, using the shift and convolution theorems: \begin{align*} \hat{u}(\omega,t)&=\hat{f}(\omega)e^{(-k\omega^2+ib\omega+c)t}\\ u(x,t)&=\mathcal{F}^{-1}\left[\hat{f}(\omega)e^{(-k\omega^2+ib\omega+c)t} \right](x)\\ &=\mathcal{F}^{-1}\left[\hat{f}(\omega)e^{(-k\omega^2+c)t}\right](x+bt)\\ &=\frac{e^{ct}}{2 \pi} \left(f * \sqrt{\frac{\pi }{k t}}\,e^{-\frac{x^2}{4kt}} \right)(x+bt)\\ u(x,t)&=\frac{e^{ct}}{\sqrt{4\pi kt}}\int_{-\infty}^\infty f(y)\,e^{-\frac{(x+bt-y)^2}{4kt}}\, dy \end{align*}
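As a sanity check on the free-space formula (my own numerical sketch, not part of the attempt; the coefficients below are arbitrary test values): with Gaussian data $f(x)=e^{-x^2}$ the convolution with the heat kernel can be done in closed form, with the zeroth-order term contributing a factor $e^{ct}$, and the result can be checked against the PDE by finite differences.

```python
import numpy as np

# Numerical sanity check of the free-space solution (a sketch; the
# coefficients are arbitrary test values, not from the problem).
# With f(x) = exp(-x^2), the Gaussian convolution evaluates to
#   u(x,t) = exp(ct) (1+4kt)^{-1/2} exp(-(x+bt)^2 / (1+4kt)),
# and we verify u_t = k u_xx + b u_x + c u by central finite differences.

k, b, c = 0.7, 0.3, -0.2

def u(x, t):
    s = 1.0 + 4.0 * k * t
    return np.exp(c * t) / np.sqrt(s) * np.exp(-(x + b * t) ** 2 / s)

x = np.linspace(-2.0, 2.0, 41)
t = 0.5
h, dt = 1e-3, 1e-4   # finite-difference step sizes

u_t  = (u(x, t + dt) - u(x, t - dt)) / (2 * dt)
u_x  = (u(x + h, t) - u(x - h, t)) / (2 * h)
u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h ** 2

residual = np.max(np.abs(u_t - (k * u_xx + b * u_x + c * u(x, t))))
print(residual)  # small, limited only by finite-difference error
```

If the formula were missing the $e^{ct}$ factor (or carried a spurious constant in the exponent), this residual would be of order one instead of finite-difference size.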

Question

I am unsure how to proceed from the last step above. In particular, I don't know how to incorporate the boundary conditions at $x=0$ and $x=\ell$.
Answer
Unfortunately I don't have much time, so the answer is short. By the linearity of the heat equation you can set $u(x,t)=v(x,t)+w(x,t)$. Then consider the following problems: \begin{equation} v_t=kv_{xx} +bv_x+cv, \qquad 0<x<\ell, t>0, \\ \qquad v(0,t)=g_{0}(t), \quad v(\ell,t)=0, \quad v(x,0)=f(x) \end{equation} and \begin{equation} w_t=kw_{xx} +bw_x+cw, \qquad 0<x<\ell, t>0, \\ \qquad w(0,t)=0, \quad w(\ell,t)=g_{1}(t), \quad w(x,0)=0 \end{equation} Define the Gaussian kernel \begin{equation} A(x,t)=\frac{e^{ct}}{\sqrt{4\pi kt}}\, e^{-\frac{(x+bt)^2}{4kt}}. \end{equation} You found the free-space solution with initial temperature distribution $f(x)$: \begin{equation} u(x,t)=\int_{-\infty}^\infty A(x-y,t)f(y)\, dy. \end{equation} It should be possible to prove that the solution of the problem for $v(x,t)$ is given by \begin{equation} v(x,t)=\int_{0}^\infty F(x,y,t)f(y)\, dy \end{equation} with $F(x,y,t)=A(x-y,t)-A(x+y,t)$. This can be done by making an odd extension of $f(x)$ to $-\infty<x<0$, thus solving the initial value problem for the extended $f_{odd}(x)$ given by \begin{equation} f_{odd}(x)= \begin{cases} f(x) & \quad 0 < x <\ell \\ -f(-x) & \quad -\ell <x < 0 \\ 0 & \quad x=0 \end{cases} \end{equation} The general method here is called the method of images.
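The image construction can be sanity-checked numerically. The sketch below is my own, restricted to the drift-free case $b=0$, $c=0$ (where $A$ reduces to the plain heat kernel and the simple reflection applies); it verifies that $v(x,t)=\int_0^\infty [A(x-y,t)-A(x+y,t)]\,f(y)\,dy$ vanishes on the boundary $x=0$ while remaining nonzero in the interior. The initial data $f$ is an arbitrary sample choice.

```python
import numpy as np

# Check of the image kernel F(x,y,t) = A(x-y,t) - A(x+y,t)
# (a sketch; restricted to b = 0, c = 0, where A is the plain heat
# kernel and reflection enforces the boundary condition at x = 0).

k, t = 1.0, 0.3

def A(x):
    # Plain heat kernel at fixed time t
    return np.exp(-x**2 / (4.0 * k * t)) / np.sqrt(4.0 * np.pi * k * t)

def f(y):
    return y * np.exp(-y)          # sample initial data on (0, inf)

y = np.linspace(0.0, 30.0, 300001)
dy = y[1] - y[0]

def v(x):
    # v(x,t) = int_0^inf [A(x-y) - A(x+y)] f(y) dy  (trapezoidal sum)
    g = (A(x - y) - A(x + y)) * f(y)
    return dy * (g.sum() - 0.5 * (g[0] + g[-1]))

print(v(0.0))   # 0 up to rounding: boundary condition v(0,t) = 0 holds
print(v(1.0))   # nonzero in the interior
```

At $x=0$ the integrand is identically zero because $A$ is even, which is exactly the cancellation the odd extension is designed to produce; with $b\neq 0$ the kernel is no longer even, so the reflection argument needs to be adapted (e.g. via the substitution that removes the drift term first).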