Solution of Poisson's equation using separation of variables


I am attempting to solve the following question for practice:

Poisson's equation using separation of variables

I know how to solve Laplace's equation using separation of variables. In this case, however, when I try a solution of the form $\Phi(r,\theta) = R(r)\Theta(\theta)$, for a single term of the RHS, I obtain the following:

$$\frac{\Theta}{r} \frac{\partial}{\partial r}\left(r \frac{\partial R}{\partial r}\right) + \frac{R}{r^2} \frac{\partial^2 \Theta}{\partial \theta^2} = \alpha_n r^n \cos(\theta)$$

It is not obvious to me how this equation can be separated so as to use the usual argument of separation of variables. I have noticed that $\Theta(\theta) = \cos(\theta)$ would be a solution for the angular part, but this may not be the general solution.


There are 2 best solutions below

BEST ANSWER

You have solved Laplace's equation, which is the homogeneous version of this equation (i.e. this equation with $\rho=0$). You should have a general solution $\varphi_0$ that has two arbitrary constants. One of the many miracles of linearity is that you only need to find one solution $\varphi_p$ (often called the particular solution) of Poisson's equation. Then $$ \varphi = \varphi_0 + \varphi_p,$$ with its requisite two arbitrary constants, will be a general solution to Poisson's equation, since the linear operator $L$ will just act on $\varphi$ like $$L\varphi = L(\varphi_0+\varphi_p) = L\varphi_0+L\varphi_p = 0 + L\varphi_p = \rho.$$
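To make the linearity argument concrete, here is a small symbolic check (my addition, using SymPy; the particular term $\tfrac12 r^3\cos\theta$ and its matching $\rho$ are illustrative choices, not taken from the answer):

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
A, B = sp.symbols('A B')

# Homogeneous (Laplace) solution for the cos(theta) mode, with two constants
phi0 = (A * r + B / r) * sp.cos(theta)

# An illustrative particular solution: beta_3 = 1/2, so
# L(phip) = (3^2 - 1) * beta_3 * r * cos(theta) = 4 r cos(theta)
beta3 = sp.Rational(1, 2)
phip = beta3 * r**3 * sp.cos(theta)
rho = 8 * beta3 * r * sp.cos(theta)

def L(f):
    """2-D Laplacian in polar coordinates."""
    return sp.diff(r * sp.diff(f, r), r) / r + sp.diff(f, theta, 2) / r**2

# L(phi0) = 0, and adding phi0 to phip does not change L(phi) = rho
assert sp.simplify(L(phi0)) == 0
assert sp.simplify(L(phi0 + phip) - rho) == 0
```

The two assertions express exactly the linearity identity above: the homogeneous piece contributes nothing to $L\varphi$, so the arbitrary constants come along for free.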

So all we need to do (assuming you've already done the hard work of understanding the general solution to Laplace's equation) is find a single $\varphi_p$ that solves it.

The form of $\rho$ suggests we should try the exact same form $$ \varphi_p = \sum_n b_n r^n\cos(\theta)$$ as an ansatz for the solution. Plugging in, we compute $$ L\varphi_p = \sum_n b_n(n^2-1)r^{n-2}\cos(\theta),$$ so we can solve by matching coefficients with $\rho$, noting that the $n=0$ term can't be there (it would produce an $r^{-2}$ term that $\rho$ lacks) and that the $n=1$ term is annihilated by $L$, so it is redundant with your general solution and may be set to zero anyway.
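As a sanity check on that coefficient computation (my addition, not part of the original answer), SymPy can verify the action of the polar Laplacian on a single ansatz term:

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
n = sp.symbols('n', integer=True, positive=True)
b = sp.symbols('b')

# Single ansatz term: b * r^n * cos(theta)
phi = b * r**n * sp.cos(theta)

# 2-D Laplacian in polar coordinates:
# L(phi) = (1/r) d/dr (r dphi/dr) + (1/r^2) d^2 phi / dtheta^2
L_phi = (sp.diff(r * sp.diff(phi, r), r) / r
         + sp.diff(phi, theta, 2) / r**2)

# Expected result from the answer: b * (n^2 - 1) * r^(n-2) * cos(theta)
expected = b * (n**2 - 1) * r**(n - 2) * sp.cos(theta)

assert sp.simplify(L_phi - expected) == 0
```

The factor $(n^2-1)$ makes both exceptional cases visible at once: it vanishes at $n=1$ (the redundant homogeneous term) and equals $-1$ at $n=0$ (the stray $r^{-2}$ term).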


Write your solution as $\varphi(r,\theta) = f(r)\cos(\theta)$ where $f(r) \equiv \sum_{n=0}^{\infty}\beta_n r^n$. Then

\begin{align}
\frac{1}{r}\,\frac{\partial}{\partial r}\left[r\,\frac{\partial f(r)}{\partial r}\right] - \frac{f(r)}{r^2} &= \sum_{n=0}^{\infty}\alpha_n r^n \\[3mm]
\sum_{n=0}^{\infty}\alpha_n r^n = \sum_{n=0}^{\infty}\beta_n n^2 r^{n-2} - \sum_{n=0}^{\infty}\beta_n r^{n-2} &= -\beta_0 r^{-2} + \sum_{n=2}^{\infty}\beta_n\left(n^2-1\right)r^{n-2} \\[3mm]
&= -\beta_0 r^{-2} + \sum_{n=0}^{\infty}\beta_{n+2}\left[(n+2)^2-1\right]r^n
\end{align}

so $\beta_0 = 0$ and $\beta_{n+2} = \dfrac{\alpha_n}{(n+2)^2-1}$ for $n \geq 0$, giving

$$ \boxed{\varphi(r,\theta) = \left(\beta_1 r + \sum_{n=2}^{\infty}\frac{\alpha_{n-2}}{n^2-1}\,r^n\right)\cos(\theta)} $$

You need one more boundary condition to determine $\beta_1$.
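As a spot-check of the boxed formula (my addition, with hypothetical sample coefficients $\alpha_0,\dots,\alpha_3$ and the free constant $\beta_1$ set to zero):

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)

# Hypothetical sample coefficients alpha_0..alpha_3 for a truncated rho
alphas = [2, -1, 3, 5]
rho = sum(a * r**k for k, a in enumerate(alphas)) * sp.cos(theta)

# Boxed solution with beta_1 = 0:
# phi = sum_{n>=2} alpha_{n-2} / (n^2 - 1) * r^n * cos(theta)
phi = sum(sp.Rational(alphas[n - 2], n**2 - 1) * r**n
          for n in range(2, 2 + len(alphas))) * sp.cos(theta)

# Apply the 2-D polar Laplacian and compare with rho
L_phi = (sp.diff(r * sp.diff(phi, r), r) / r
         + sp.diff(phi, theta, 2) / r**2)

assert sp.simplify(L_phi - rho) == 0
```

Each $r^n$ term of $\varphi$ is mapped by $L$ to $(n^2-1)\,r^{n-2}$ times its coefficient, which is exactly what the denominator $n^2-1$ cancels.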