How to justify differentiating an asymptotic series in WKB method


Given a second-order linear ordinary differential equation, \begin{equation} \epsilon^2 \frac{d^2y}{dx^2} = Q(x) y(x), \tag{1}\label{1} \end{equation} where $\epsilon$ is regarded as a small positive number, a typical explanation of the WKB method (e.g. Ch. 10 of Bender and Orszag, Advanced Mathematical Methods for Scientists and Engineers, 1978) starts with writing $y(x)$ as \begin{equation} y(x) = \exp\left(\frac{1}{\delta}S(x,\delta)\right), \tag{2}\label{2} \end{equation} and assuming that $S(x,\delta)$ has an asymptotic series in $\delta$, \begin{align} S(x,\delta) \sim& \sum_{n=0}^{\infty} \delta^n S_n(x) & \text{ as } \delta\rightarrow&0+. \tag{3}\label{3} \end{align} Then, by substituting eqs. (\ref{2}) and (\ref{3}) into eq. (\ref{1}) and dividing both sides by $\exp(S/\delta)$, it is claimed that \begin{align} \frac{\epsilon^2}{\delta^2} \left[\sum_{n=0}^{\infty} \delta^n \frac{d S_n}{dx}\right]^2 +\frac{\epsilon^2}{\delta} \sum_{n=0}^{\infty} \delta^n \frac{d^2 S_n}{dx^2} \sim& Q(x) & \text{ as } \delta\rightarrow&0+. \tag{4}\label{4} \end{align} From here, the argument proceeds that we can set $\delta = \epsilon$ from dominant balance, and that we can equate the like powers of $\epsilon$ on both sides to yield differential equations for the coefficient functions $\{S_n(x)\}$ as \begin{align} \left(\frac{dS_0}{dx}\right)^2 =& Q(x), \tag{5}\label{5}\\ 2 \frac{dS_0}{dx}\frac{dS_1}{dx}+\frac{d^2S_0}{dx^2}=&0, \tag{6}\label{6}\\ \dots \end{align} and these equations are solved one after another.
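To make eqs. (\ref{5}) and (\ref{6}) concrete: together they say that $y \approx Q^{-1/4} \exp(S_0/\epsilon)$ solves eq. (\ref{1}) up to a relative error of order $\epsilon^2$. This can be checked symbolically; the following is a sketch using sympy, where both the tool and the sample choice $Q(x) = x$ are my own for illustration:

```python
import sympy as sp

# Check the leading-order WKB form y = Q^(-1/4) * exp(S0/eps), S0' = sqrt(Q),
# against eps^2 y'' = Q y for the sample choice Q(x) = x.
x, eps = sp.symbols('x epsilon', positive=True)
Q = x
S0 = sp.integrate(sp.sqrt(Q), x)          # S0 = (2/3) x^(3/2)
y = Q**sp.Rational(-1, 4) * sp.exp(S0 / eps)

# Residual of the ODE, divided by y so the orders in eps are visible:
residual = sp.simplify((eps**2 * y.diff(x, 2) - Q * y) / y)
print(residual)  # only an O(eps^2) term survives
```

For $Q(x) = x$ the surviving residual is $5\epsilon^2/(16x^2)$, confirming that the first two WKB orders cancel the $O(1)$ and $O(\epsilon)$ contributions.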

Here is my question. How is it justified to differentiate the asymptotic series in eq. (\ref{3}) to derive eq. (\ref{4})?

I wondered whether we could somehow use the fact that $y(x)$ is a solution of eq. (\ref{1}), but I have not found a way so far.

I have read that differentiating both sides of an asymptotic relation does not always yield another asymptotic relation, and I guess that even if \begin{align} f(x,\epsilon)\sim& g_0(x) + g_1(x) \epsilon + g_2(x) \epsilon^2 +\cdots& \text{ as } \epsilon\rightarrow& 0, \tag{7}\label{7} \end{align} uniformly in some domain of $x$, it does not necessarily follow that \begin{align} \frac{df}{dx}(x,\epsilon) \sim& \frac{dg_0}{dx}(x) + \frac{dg_1}{dx}(x)\epsilon + \frac{dg_2}{dx}(x) \epsilon^2 +\cdots& \text{ as }\epsilon \rightarrow& 0. \tag{8}\label{8} \end{align} Or, does it follow (under some conditions)?


There are 2 best solutions below


In the context of asymptotic series, the `$\sim$'-symbol means the following:

Fix $N \in \mathbb{N}$. There exist a $\delta_0 > 0$ and a constant $C_N$ such that for all positive $\delta < \delta_0$, \begin{equation} \left| S(x,\delta) - \sum_{n=0}^N s_n(x) \delta^n \right| \leq C_N \delta^{N+1}, \tag{*} \end{equation} uniformly in $x$ (i.e. $C_N$ does not depend on $x$ in the domain in question).
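To illustrate the uniform order estimate in (*) with a toy function (my own example, not part of the WKB setup): for $f(\delta) = 1/(1+\delta)$ the coefficients are $s_n = (-1)^n$, and the remainder after $N$ terms is exactly $\delta^{N+1}/(1+\delta)$, so the ratio of the remainder to $\delta^{N+1}$ stays bounded:

```python
import math

# Toy illustration of the order estimate: f(d) = 1/(1+d) has the expansion
# 1 - d + d^2 - ..., and the remainder after N terms is d^(N+1)/(1+d),
# so |remainder| <= C_N * d^(N+1) with C_N = 1.
def remainder(d, N):
    partial = sum((-d)**n for n in range(N + 1))
    return abs(1.0 / (1.0 + d) - partial)

ratios = [remainder(0.01, N) / 0.01**(N + 1) for N in (1, 2, 3)]
print(ratios)  # each close to 1/(1 + 0.01), i.e. bounded
```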

If we can find a domain for $x$ and a $\delta_0>0$ such that the above holds for all $N\in \mathbb{N}$, then we use the notation \begin{equation} S(x,\delta) \sim \sum_{n=0}^\infty s_n(x) \delta^n. \end{equation}

Note that this does not mean that the series converges in the classical sense. Moreover, we can add a term like $e^{-x/\delta}$ to the original function without changing the series approximation at any order. For more information on asymptotic series, see e.g.

F. Verhulst, Methods and Applications of Singular Perturbations, Springer, 2005.
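As a quick numerical illustration of the remark above that terms like $e^{-x/\delta}$ are invisible to the power series (my own check, with $x$ fixed at $1$): $e^{-1/\delta}$ divided by any fixed power $\delta^N$ tends to zero as $\delta \rightarrow 0+$.

```python
import math

# e^(-1/d) is "beyond all orders": for every fixed N, e^(-1/d) / d^N -> 0
# as d -> 0+, so adding such a term changes no coefficient of the series.
N = 10
ratios = [math.exp(-1.0 / d) / d**N for d in (0.1, 0.05, 0.02, 0.01)]
print(ratios)  # decreasing, and already ~1e-24 at d = 0.01
```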


I am answering my own question while waiting for more answers and comments.

To keep the assumptions minimal, I think the right thing to assume is that $\partial S/\partial x$, rather than $S$ itself, has an asymptotic expansion. Then we are allowed to integrate this asymptotic expansion term by term, yielding the asymptotic expansion of $S$. Furthermore, we can justify differentiating the asymptotic expansion of $\partial S/\partial x$ term by term to yield the asymptotic expansion of $\partial^2 S/\partial x^2$, which we need for solving the differential equation, by using the Riccati equation satisfied by $\partial S/\partial x$.

So, let me explain this idea in more detail. As written in the original question, we solve the ordinary differential equation, \begin{equation} \epsilon^2 \frac{d^2y}{dx^2} = Q(x) y(x). \tag{1}\label{1a} \end{equation} Note that the solution $y$ depends on the parameter $\epsilon$ as well as on the independent variable $x$ of the differential equation. Let us transform the dependent variable from $y$ to $S$ by \begin{equation} y(x,\epsilon) = \exp\left(\frac{1}{\epsilon}S(x,\epsilon)\right). \tag{2}\label{2a} \end{equation} By substituting eq. (\ref{2a}) into eq. (\ref{1a}), we obtain the differential equation to be satisfied by $S$ as \begin{equation} \left[\frac{\partial S}{\partial x}\right]^2 +\epsilon \frac{\partial^2 S}{\partial x^2} = Q(x). \tag{3}\label{3a} \end{equation} By introducing the notation \begin{equation} u(x,\epsilon) = \frac{\partial S}{\partial x}(x,\epsilon), \tag{4}\label{4a} \end{equation} eq. (\ref{3a}) can be written as \begin{equation} \epsilon \frac{\partial u}{\partial x} + u^2 = Q(x). \tag{5}\label{5a} \end{equation} This is a Riccati equation for $u$.
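The algebra leading to eq. (\ref{3a}) is short but easy to fumble; it can be verified symbolically (a sketch using sympy, which is my choice of tool):

```python
import sympy as sp

# Verify that substituting y = exp(S/eps) into eps^2 y'' = Q y and dividing
# by y yields (S')^2 + eps*S'' - Q = 0.
x, eps = sp.symbols('x epsilon', positive=True)
S = sp.Function('S')(x)
Q = sp.Function('Q')(x)
y = sp.exp(S / eps)

lhs = sp.expand(sp.simplify((eps**2 * y.diff(x, 2) - Q * y) / y))
print(lhs)  # equals (S')^2 + eps*S'' - Q(x)
```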

Now let us make an assumption that $u(x,\epsilon)$ has a uniform asymptotic power series in some domain $D$ of $x$ as $\epsilon$ approaches zero with $\epsilon >0$, i.e., \begin{align} u(x,\epsilon) \sim& \sum_{n=0}^{\infty} u_n(x) \epsilon^n & \text{ as } \epsilon\rightarrow&0+& \text{ uniformly for } x\in D. \tag{6}\label{6a} \end{align} We first demonstrate that the asymptotic expansion of $\partial u/\partial x$ (which is equal to $\partial^2 S/\partial x^2$) is in the form obtained by differentiating eq. (\ref{6a}) term by term.

From the Riccati equation (\ref{5a}), \begin{equation} \frac{\partial u}{\partial x} = -\frac{1}{\epsilon} [u(x,\epsilon)]^2 +\frac{1}{\epsilon} Q(x), \tag{7}\label{7a} \end{equation} and from the assumption (\ref{6a}), $u^2$ also has an asymptotic power series, \begin{align} [u(x,\epsilon)]^2 \sim& \sum_{k=0}^{\infty} U_k(x) \epsilon^k,& \text{ as } \epsilon\rightarrow&0+, \tag{8}\label{8a} \end{align} where \begin{align} U_k(x) =& \sum_{n=0}^k u_n(x) u_{k-n}(x),& k=&0,1,2,\cdots. \tag{9}\label{9} \end{align} From eqs. (\ref{7a}) and (\ref{8a}), it follows that $\partial u/\partial x$ has an asymptotic expansion, \begin{align} \frac{\partial u}{\partial x}(x,\epsilon) \sim& \sum_{k=0}^{\infty} v_k(x) \epsilon^{k-1} & \text{ as } \epsilon\rightarrow&0+, \tag{10}\label{10} \end{align} where \begin{equation} v_0(x) = -U_0(x) +Q(x) = -[u_0(x)]^2 + Q(x), \tag{11}\label{11} \end{equation} and \begin{align} v_k(x) =& -U_k(x), & k=& 1,2,3,\dots. \tag{12}\label{12} \end{align}
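The coefficient formula (\ref{9}) is just the Cauchy product of the series with itself; as a sanity check (my own, using sympy with generic symbolic coefficient functions):

```python
import sympy as sp

# Cauchy product: if u ~ sum_n u_n eps^n, then u^2 ~ sum_k U_k eps^k with
# U_k = sum_{n=0}^k u_n u_{k-n}.  Checked on a truncated series with
# generic coefficient functions u0(x), ..., u3(x).
x, eps = sp.symbols('x epsilon')
u = [sp.Function('u%d' % n)(x) for n in range(4)]
utrunc = sum(un * eps**n for n, un in enumerate(u))

sq = sp.expand(utrunc**2)
U = [sum(u[n] * u[k - n] for n in range(k + 1)) for k in range(4)]
checks = [sp.simplify(sq.coeff(eps, k) - U[k]) for k in range(4)]
print(checks)  # [0, 0, 0, 0]
```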

Note that we are allowed to integrate an asymptotic expansion term by term. That is, from eq. (\ref{10}) it follows that for any $a$ and $b$ in $D$, \begin{align} \int_a^b dx \frac{\partial u}{\partial x} =u(b,\epsilon)-u(a,\epsilon) \sim& \sum_{k=0}^{\infty} \left[\int_a^b dx\, v_k(x)\right] \epsilon^{k-1} & \text{ as } \epsilon\rightarrow&0+. \tag{13}\label{13} \end{align} Note also that we are allowed to add two asymptotic expansions, and from eq. (\ref{6a}) it follows that \begin{align} u(b,\epsilon) - u(a,\epsilon) \sim& \sum_{n=0}^{\infty} [u_n(b)-u_n(a)] \epsilon^n & \text{ as } \epsilon\rightarrow&0+. \tag{14}\label{14} \end{align} Since the asymptotic expansion of a function with respect to a given sequence of gauge functions is unique, the expansion coefficients in eqs. (\ref{13}) and (\ref{14}) must agree for each power of $\epsilon$. In particular, the coefficient of the $\epsilon^{-1}$ term in eq. (\ref{13}) must vanish, i.e., \begin{equation} \int_a^b dx\, v_0(x) =0, \tag{15}\label{15} \end{equation} for all $a$ and $b$ in $D$. Assuming $v_0$ is continuous, this means that \begin{equation} v_0(x) = 0. \tag{16}\label{16} \end{equation} From the comparison of the $\epsilon^n$ ($n\geq 0$) terms between eqs. (\ref{13}) and (\ref{14}), \begin{equation} \int_a^b dx\, v_{n+1}(x) = u_n(b) - u_n(a), \tag{17}\label{17} \end{equation} for all $a$ and $b$ in $D$. This means (again assuming continuity) that \begin{align} v_{n+1}(x) =& \frac{du_n}{dx}(x), & n=&0,1,2,\cdots & \text{ for all } x \in D. \tag{18}\label{18} \end{align} Using eqs. (\ref{16}) and (\ref{18}), we can rewrite eq. (\ref{10}) as \begin{align} \frac{\partial u}{\partial x}(x,\epsilon) \sim& \sum_{n=0}^{\infty} \frac{du_n}{dx}(x) \epsilon^n & \text{ as } \epsilon\rightarrow&0+. \tag{19}\label{19} \end{align} Herewith we have demonstrated that the asymptotic expansion of $\partial u/\partial x$ ($=\partial^2 S/\partial x^2$) has the form obtained by differentiating eq. (\ref{6a}) term by term.

Before moving on to deriving the asymptotic expansion of $S$, let us consider the meaning of eqs. (\ref{16}) and (\ref{18}). Substituting eq. (\ref{11}) into eq. (\ref{16}), we obtain \begin{equation} [u_0(x)]^2 = Q(x), \tag{20}\label{20} \end{equation} or equivalently, \begin{equation} u_0(x) = \pm [Q(x)]^{1/2}. \tag{21}\label{21} \end{equation} Substituting eq. (\ref{12}) into eq. (\ref{18}), we obtain \begin{align} U_{n+1}(x) =& -\frac{du_n}{dx}(x), & n=&0,1,2,\dots, \tag{22}\label{22} \end{align} which can be rewritten with eq. (\ref{9}) as \begin{align} \sum_{l=0}^{n+1} u_l(x) u_{n+1-l}(x) =& -\frac{du_n}{dx}(x), & n=&0,1,2,\cdots , \tag{23}\label{23} \end{align} or equivalently, \begin{align} u_{n+1}(x) =& -\frac{1}{2 u_0(x)}\left\{ \sum_{l=1}^n u_l(x) u_{n+1-l}(x) +\frac{du_n}{dx}(x)\right\}, & n=&0,1,2,\cdots . \tag{24}\label{24} \end{align} We can see that eqs. (\ref{21}) and (\ref{24}) determine the coefficients $\{u_n(x)\}$ of the asymptotic power series of $u(x,\epsilon)$ ($=\partial S/\partial x$) recursively.
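As a sanity check of the recursion, one can determine the coefficients $u_n$ from the Riccati equation (\ref{5a}) by matching powers of $\epsilon$ and verify that the truncated series satisfies eq. (\ref{5a}) up to the expected order; here is a sketch with sympy, where the tool and the sample choice $Q(x) = x$ are mine (the signs below follow directly from matching powers of $\epsilon$ in $\epsilon\,\partial u/\partial x + u^2 = Q$):

```python
import sympy as sp

# Determine u_0, u_1, u_2 recursively from eps*u' + u^2 = Q by matching
# powers of eps (sample choice Q(x) = x), then confirm that the truncated
# series leaves a residual of order eps^3 in the Riccati equation.
x, eps = sp.symbols('x epsilon', positive=True)
Q = x

u = [sp.sqrt(Q)]  # u_0 = +sqrt(Q), the leading balance u_0^2 = Q
for n in range(2):
    cross = sum(u[l] * u[n + 1 - l] for l in range(1, n + 1))
    u.append(sp.simplify(-(u[n].diff(x) + cross) / (2 * u[0])))

utrunc = sum(un * eps**n for n, un in enumerate(u))
R = sp.expand(eps * utrunc.diff(x) + utrunc**2 - Q)
coeffs = [sp.simplify(R.coeff(eps, k)) for k in range(3)]
print(u)       # u_1 = -1/(4x), u_2 = -5/(32 x^(5/2))
print(coeffs)  # [0, 0, 0]: the residual is O(eps^3)
```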

Now let us consider the asymptotic expansion of $S(x,\epsilon)$. From eq. (\ref{4a}), we have \begin{equation} S(b,\epsilon) -S(a,\epsilon) = \int_a^b dx\, u(x,\epsilon), \tag{25}\label{25} \end{equation} for all $a$ and $b$ in $D$. Since we are allowed to integrate an asymptotic power series term by term, from eq. (\ref{6a}), \begin{align} \int_a^b dx\, u(x,\epsilon) \sim& \sum_{n=0}^{\infty} \left[ \int_a^b dx\, u_n(x)\right] \epsilon^n & \text{ as } \epsilon\rightarrow&0+, \tag{26}\label{26} \end{align} for all $a$ and $b$ in $D$. From eqs. (\ref{25}) and (\ref{26}), we can say that $S(x,\epsilon)$ has an asymptotic power series, \begin{align} S(x,\epsilon) \sim& \sum_{n=0}^{\infty} S_n(x) \epsilon^n & \text{ as } \epsilon\rightarrow&0+, \tag{27}\label{27} \end{align} with \begin{align} S_n(x) =& \int_a^x dt\, u_n(t)+S_n(a), & n=&0,1,2,\dots \tag{28}\label{28} \end{align} for all $x$ and $a$ in $D$. From eq. (\ref{28}), we can say \begin{align} u_n(x) =& \frac{dS_n}{dx}(x), & n=&0,1,2,\dots. \tag{29}\label{29} \end{align} Herewith we have confirmed that the assumption that $u(x,\epsilon)$ ($=\partial S/\partial x$) can be expanded in a uniform asymptotic power series implies that $S$ can also be expanded in an asymptotic power series, and that the former series is obtained by term-wise differentiation of the latter.
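For the sample choice $Q(x) = x$ (my illustrative choice), carrying out the integration in eq. (\ref{28}) with the integration constants set to zero reproduces the familiar WKB phase and amplitude terms:

```python
import sympy as sp

# For Q(x) = x, integrate the first two Riccati coefficients term by term
# (integration constants set to zero).  Expected: S_0 = (2/3) x^(3/2),
# the WKB phase, and S_1 = -(1/4) log x, i.e. the amplitude Q^(-1/4).
x = sp.symbols('x', positive=True)
Q = x
u0 = sp.sqrt(Q)
u1 = sp.simplify(-u0.diff(x) / (2 * u0))  # first step of the recursion
S0 = sp.integrate(u0, x)
S1 = sp.integrate(u1, x)
print(S0, S1)
```

Note that $\exp(S_1) = Q^{-1/4}$, which is exactly the amplitude factor of the leading WKB approximation.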

We can write eqs. (\ref{21}) and (\ref{24}) in terms of $\{S_n(x)\}$ as \begin{equation} \frac{dS_0}{dx} = \pm [Q(x)]^{1/2}, \tag{30}\label{30} \end{equation} and \begin{align} \frac{dS_{n+1}}{dx} =& -\left[2 \frac{dS_0}{dx}\right]^{-1}\left\{ \sum_{l=1}^n \frac{dS_l}{dx} \frac{dS_{n+1-l}}{dx} +\frac{d^2S_n}{dx^2}\right\}, & n=&0,1,2,\cdots . \tag{31}\label{31} \end{align} These equations [eqs. (\ref{30}) and (\ref{31})] appear in many textbooks explaining the WKB approximation; in particular, eq. (\ref{31}) with $n=0$ reproduces eq. (\ref{6}) of the question.

So, in summary: by assuming the existence of the asymptotic power series for $\partial S/\partial x$, we can justify differentiating it term by term using the Riccati equation, and we are guaranteed an asymptotic power series for $S$ whose term-wise differentiation gives the assumed asymptotic power series for $\partial S/\partial x$.