Does the Magnus convergence test not hold for the factorization of second-order differential operators?


$\DeclareMathOperator{\sech}{sech}\DeclareMathOperator{\csch}{csch}$Given the operator \begin{align} H = V(x)-\partial_x^2 \end{align} and an eigenfunction $\phi_0(x)$ with zero eigenvalue, $H\phi_0=0$, I can factor $H$ as \begin{align} H = h_+h_- \end{align} where \begin{align} h_{\pm} = \dfrac{\phi_0'}{\phi_0} \pm \partial_x \end{align} If $H\phi_j=-\lambda_j^2\phi_j$, where $\lambda_j$ is a real constant, then $(h_-h_+)(h_-\phi_j)=-\lambda_j^2(h_-\phi_j)$, and thus \begin{align} \begin{pmatrix} h_+h_- & 0 \\ 0 & h_-h_+ \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} = - \lambda_j^2 \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} \end{align} This leads to the following operator factorization \begin{align} \begin{pmatrix} 0 & h_+ \\ h_- & 0 \end{pmatrix} \begin{pmatrix} 0 & h_+ \\ h_- & 0 \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} = (i\lambda_j)(i\lambda_j) \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} \end{align} from which I assume \begin{align} \begin{pmatrix} 0 & h_+ \\ h_- & 0 \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} = (i\lambda_j) \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} \end{align} I rearrange terms for Magnus-series compatibility: \begin{align} \begin{pmatrix} -(i\lambda_j) & h_+ \\ h_- & -(i\lambda_j) \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} =0 \end{align} \begin{align} \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} -(i\lambda_j) & h_+ \\ h_- & -(i\lambda_j) \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} =0 \end{align} \begin{align} \begin{pmatrix} -h_- & (i\lambda_j) \\ -(i\lambda_j) & h_+ \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} =0 \end{align} \begin{align} \begin{pmatrix} \partial_x-\phi_0'/\phi_0 & (i\lambda_j) \\ -(i\lambda_j) & \partial_x+\phi_0'/\phi_0 \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} =0 \end{align} \begin{align} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix}' = \begin{pmatrix} \phi_0'/\phi_0 & -(i\lambda_j) \\ (i\lambda_j) & -\phi_0'/\phi_0 \end{pmatrix} \begin{pmatrix} \phi_j \\ h_-\phi_j \end{pmatrix} \end{align} which sets up the Magnus differential equation \begin{align} Y'=AY \end{align} I find the maximum-modulus eigenvalue of $A$ to be \begin{align} \|A\|_2 = \sqrt{\left(\dfrac{\phi_0'}{\phi_0}\right)^2+\lambda_j^2} \end{align} and then calculate the range of convergence \begin{align} \int_{x_1}^{x_2}\sqrt{\left(\dfrac{\phi_0'}{\phi_0}\right)^2+\lambda_j^2}\, dx < \pi \end{align}
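Two of the steps above can be checked symbolically (a quick SymPy sketch; the symbol `a` stands in for $\phi_0'/\phi_0$, treated as a scalar when diagonalizing $A$ pointwise): the factorization $H=h_+h_-$ with $V=\phi_0''/\phi_0$ (which follows from $H\phi_0=0$), and the maximum-modulus eigenvalue of $A$.

```python
import sympy as sp

x, lam = sp.symbols('x lambda', real=True, positive=True)
phi0 = sp.Function('phi0')(x)
f = sp.Function('f')(x)

# The factor operators h± = phi0'/phi0 ± d/dx, acting on a test function
h_plus  = lambda g: (phi0.diff(x) / phi0) * g + g.diff(x)
h_minus = lambda g: (phi0.diff(x) / phi0) * g - g.diff(x)

# Check H = h_+ h_-  with  V = phi0''/phi0  (since H phi0 = 0)
H_f = (phi0.diff(x, 2) / phi0) * f - f.diff(x, 2)
print(sp.simplify(h_plus(h_minus(f)) - H_f))  # 0

# Eigenvalues of the Magnus matrix A, with `a` standing in for phi0'/phi0
a = sp.symbols('a', real=True, positive=True)
A = sp.Matrix([[a, -sp.I * lam], [sp.I * lam, -a]])
print(A.eigenvals())  # eigenvalues ±sqrt(a**2 + lambda**2)
```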

However, the calculation of the range of convergence quickly broke down for simple forms of $H$. For example \begin{align} H = -2\sech(x)^2-\partial_x^2 \end{align} has the zero eigenfunction $\phi_0=\tanh(x)$ and only one normalizable eigenfunction $\phi_{-1} = \sech(x)/\sqrt{2}$, with $\lambda_{-1}=1$. Since $\phi_0'(x)=\sech^2(x)$, so that $\phi_0'/\phi_0=\csch(x)\sech(x)$, the range of convergence reads \begin{align} \int_{x_1}^{x_2}\sqrt{\left(\csch(x)\sech(x)\right)^2+\lambda_{-1}^2}\, dx < \pi \end{align} This convergence condition fails whenever the integration interval gets too close to $x=0$, even though $\phi_{-1}$ is perfectly well behaved around $x=0$. What is wrong with this analysis?
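To make the failure concrete, here is a numerical sketch with SciPy (the fixed right endpoint $x_2=1$ is my own choice for illustration): since $\csch(x)\sech(x)\sim 1/x$ near the origin, the convergence integral grows roughly like $\log(1/x_1)$ and eventually exceeds $\pi$ as the left endpoint $x_1\to 0^+$.

```python
import numpy as np
from scipy.integrate import quad

def integrand(x, lam=1.0):
    # Integrand of the convergence bound for phi0 = tanh(x):
    # sqrt((csch(x) sech(x))^2 + lam^2), with csch*sech = 1/(sinh*cosh)
    return np.sqrt((1.0 / (np.sinh(x) * np.cosh(x)))**2 + lam**2)

x2 = 1.0
for x1 in [0.5, 0.1, 0.01, 0.001]:
    val, _ = quad(integrand, x1, x2)
    print(f"x1 = {x1:6.3f}:  integral = {val:8.3f}  (exceeds pi? {val > np.pi})")
```

The bound holds for intervals away from the origin but is violated once $x_1$ is small enough, even though nothing pathological happens to $\phi_{-1}$ there.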