I am stuck on this problem I have to do for my PDE class.
Consider heat diffusion in a 1D object between x=0 and x=L. It can be shown that the heat flux at the surface is $-k\nabla{}T\cdot\hat{n}$, i.e. it is proportional to the normal derivative at the surface. In our case, we specify adiabatic boundary conditions as follows: $$u_x(0,t)= 0 $$ $$ u_x(L,t)=0$$ If the initial temperature is $f(x)$, solve for the temperature as a function of $x$ and $t$.
To be honest, I don't really know where to start. So far I have tried to solve the differential equation $$ \frac{\partial u}{\partial t} =-k\nabla{}u\cdot\hat{n}$$ using separation of variables, but I got nowhere. Could anyone help me understand how to properly start this problem? Thank you!
I will omit the full derivation of the heat equation, but you should have $$ \frac{\partial u}{\partial t} = k\Delta u. $$ The idea is that for any small region $V$ we have $$\frac{d}{dt} \int_V u = -\int_{\partial V} (-k\nabla u) \cdot {\bf n}\,dS = \int_{\partial V} k\nabla u \cdot {\bf n}\,dS.$$ This equation is simply saying that the energy change in the region $V$ (LHS) equals the net flux into $V$ through its boundary (RHS): the flux vector is $-k\nabla u$, ${\bf n}$ is the outward normal, and energy decreases by the amount of outward flux. Using the divergence theorem on the right-hand side you can derive the heat equation.
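Spelling out the divergence-theorem step (with ${\bf n}$ the outward normal): $$\int_{\partial V} k\,\nabla u \cdot {\bf n}\,dS = \int_V \nabla\cdot\left(k\,\nabla u\right)dV = \int_V k\,\Delta u\,dV,$$ and since this holds for every region $V$, the integrands must agree pointwise, giving $u_t = k\Delta u$.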
Now, in 1D, on the interval $[0,L]$ we have simply
$$u_t = k u_{xx}.$$
Assume that $u(x,0)= f(x)$ and $u_x(0,t)=u_x(L,t)=0$. You can use separation of variables from here: assume that $u(x,t) = X(x)T(t)$ and see that
$$XT' = kX''T.$$
Rearranging, and noting that the left-hand side depends only on $t$ while the right-hand side depends only on $x$ (so both must equal a constant, written $-\lambda$), we have
$$\frac{T'}{kT} = \frac{X''}{X} = -\lambda.$$
This gives us two ODEs:
$$T'+k\lambda T=0$$
and
$$X''+\lambda X=0.$$
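For reference, the $T$ equation is solved immediately, while the general solution of the $X$ equation depends on the sign of $\lambda$: $$T(t) = C e^{-k\lambda t}, \qquad X(x) = \begin{cases} A\cos(\sqrt{\lambda}\,x) + B\sin(\sqrt{\lambda}\,x), & \lambda > 0,\\ A + Bx, & \lambda = 0,\\ A\cosh(\sqrt{-\lambda}\,x) + B\sinh(\sqrt{-\lambda}\,x), & \lambda < 0. \end{cases}$$ The boundary conditions will select which values of $\lambda$ actually give nontrivial solutions.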
Note the boundary values: $u_x(0,t) = X'(0)T(t)=0$ and $u_x(L,t)=X'(L)T(t)=0$.
If $T(t)\equiv 0$ then $u\equiv 0$ and we only recover the trivial solution, so we may assume that $X'(0)=X'(L)=0$. The rest of the problem is a standard eigenvalue calculation for these ODEs.
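As a sketch of those calculations: imposing $X'(0)=X'(L)=0$ rules out $\lambda<0$ and kills the linear term at $\lambda=0$, leaving the Neumann eigenvalues and eigenfunctions $$\lambda_n = \left(\frac{n\pi}{L}\right)^2, \qquad X_n(x) = \cos\!\left(\frac{n\pi x}{L}\right), \qquad n = 0,1,2,\dots$$ Superposing the products $X_n(x)T_n(t)$ and matching the initial condition $u(x,0)=f(x)$ via the Fourier cosine expansion of $f$ gives $$u(x,t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} a_n \cos\!\left(\frac{n\pi x}{L}\right) e^{-k\left(\frac{n\pi}{L}\right)^2 t}, \qquad a_n = \frac{2}{L}\int_0^L f(x)\cos\!\left(\frac{n\pi x}{L}\right)dx.$$ As a sanity check, $u \to \frac{a_0}{2} = \frac{1}{L}\int_0^L f(x)\,dx$ as $t\to\infty$: an insulated rod settles to the average of its initial temperature.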