Consider the problem of a sphere of material that starts at a non-uniform temperature, $T = r^{2}$, and is covered with insulation on the outer surface so that no heat gets out. We take the coordinate $r$ to measure position radially from the centre of the sphere, with the outer surface at $r = 1$. We take $t$ as time and $T(r, t)$ as the temperature. The equation governing the heat flow is the heat equation $$\frac{\partial T}{\partial t}=\frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2\frac{\partial T}{\partial r}\right), \quad 0\leq r\leq 1,\ t>0.$$
We insist that $T$ remains finite as $r \to 0$, and the insulated outer surface gives the boundary condition $$\frac{\partial T}{\partial r}(1,t)=0, \quad t>0,$$ with the initial condition $$T(r,0) = r^2, \quad 0 < r < 1.$$
(a) Find the solution $T(r,t)$ using separation of variables. Note: you can leave the answer in a form where the eigenvalues are given by the roots of an equation. By exploiting the orthogonality of the eigenfunctions, you should give the integral formulas necessary to compute the coefficients in the solution. You do not need to evaluate the integrals.
(b) Find numerically, to five-decimal-place accuracy, the value of the smallest of the eigenvalues in (a), corresponding to the first non-constant term in the solution.
This is what I have obtained so far:
$$\frac{\partial T}{\partial t}=\frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2\frac{\partial T}{\partial r}\right)=\frac{\partial^2 T}{\partial r^2}+ \frac{2}{r} \frac{\partial T}{\partial r}$$
Let $T=R(r)Q(t)$
$\frac{\partial T}{\partial t}= R(r)Q'(t)$
$\frac{\partial T}{\partial r}= R'(r)Q(t)$
$\frac{\partial^2 T}{\partial r^2}= R''(r)Q(t)$
Substituting back into the equation and separating variables, we get $$\frac{Q'(t)}{Q(t)}=\frac{R''(r)}{R(r)} + \frac{2}{r}\frac{R'(r)}{R(r)}= \mu,$$ which gives the two equations $$Q'(t)-\mu Q(t)=0$$ and $$rR''(r)+2R'(r)-\mu rR(r)=0.$$
Next I made the substitution:
Let $o(r)=rR(r)$
$o'(r)=R(r)+rR'(r)$
$o''(r)=2R'(r)+rR''(r)$
Putting this back into the equation we get $o''(r)-\mu o(r)=0$
Now how do I proceed? Using the boundary conditions, how do I implement them in this equation?
From a basic ODE class you should know that the solution to $y'' = \lambda y$ is either ($\lambda <0$) $$y = A \sin (\sqrt {-\lambda}\, t) + B \cos ( \sqrt {-\lambda}\, t ),$$ ($\lambda =0$) $$y = A t + B,$$ or ($\lambda >0$) $$ y = A \exp (\sqrt \lambda\, t) + B \exp ( -\sqrt \lambda\, t ).$$ Use the boundary conditions to see which case you fall into; this will give you your eigenvalue equation.
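For part (b): if you carry the insulated-surface condition (no heat flux at $r=1$) and the finiteness condition at $r=0$ through the $\mu<0$ case, the eigenvalue equation works out to $\tan k = k$ with $k=\sqrt{-\mu}$. As a sketch (assuming that equation, and assuming the bracket $[4,\,4.7]$, where $\sin k - k\cos k$ first changes sign away from $k=0$), the smallest nonzero root can be bisected to well past five decimal places:

```python
import math

def f(k):
    # sin(k) - k*cos(k) = 0 is equivalent to tan(k) = k,
    # but has no singularities, so it is safe to bisect.
    return math.sin(k) - k * math.cos(k)

# Bracket the first nonzero root: f(4) > 0, f(4.7) < 0.
a, b = 4.0, 4.7
for _ in range(60):              # 60 halvings shrink the bracket below 1e-17
    m = 0.5 * (a + b)
    if f(a) * f(m) <= 0:         # sign change in [a, m]: keep left half
        b = m
    else:                        # otherwise the root is in [m, b]
        a = m

k = 0.5 * (a + b)
print(round(k, 5))               # first nonzero root of tan k = k
print(round(k * k, 5))           # corresponding value of -mu = k^2
```

Depending on convention you would report $k$ or $k^2 = -\mu$ as the eigenvalue; either is determined to five decimal places by this root.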
Hint: what does the condition $$ \frac{ \partial T }{\partial r } (1,t) = 0, \quad t>0 $$ imply about $\mu$? Then use a (generalized) Fourier series with the initial condition to deduce the solution.
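Putting the hints together, here is a sketch of the form the answer takes — under the assumptions that the insulating condition is $\partial T/\partial r(1,t)=0$, that finiteness at $r=0$ selects $R_n(r)=\sin(k_n r)/r$, and that the resulting eigenvalue equation is $\tan k_n = k_n$:
$$T(r,t) = c_0 + \sum_{n=1}^{\infty} c_n\, e^{-k_n^2 t}\, \frac{\sin(k_n r)}{r}, \qquad \tan k_n = k_n,$$
with coefficients obtained from orthogonality with weight $r^2$ (the radial equation is in Sturm–Liouville form with $p(r)=w(r)=r^2$):
$$c_0 = \frac{\int_0^1 r^2 \cdot r^2 \, dr}{\int_0^1 r^2 \, dr} = \frac{3}{5}, \qquad c_n = \frac{\int_0^1 r^2 \cdot \frac{\sin(k_n r)}{r} \cdot r^2 \, dr}{\int_0^1 r^2 \left(\frac{\sin(k_n r)}{r}\right)^2 dr}.$$
The constant term $c_0$ is just the mean of the initial temperature over the sphere, which is conserved because no heat escapes.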