Can we use Power Series Solution for points other than x=0 to escape Frobenius Solution?


Suppose I have an equation

$y'' + P(x)y' + Q(x)y = 0$

Now we apply power series when $P$ and $Q$ are analytic at $x=0$ and apply Frobenius method when $P$ and $Q$ are not analytic at $x=0$.

Now, I want to know why we apply the Frobenius method at all. We could equally have taken a power series in powers of $(x-a)$, where $P$ and $Q$ would be analytic at $a$, and $a$ could be anything, e.g. $2, 3, \dots, 100$. But we don't do that; we always use the Frobenius method at $x=0$.

So why do we not do that? Is that wrong? Why do we always look for a series centered at $x=0$?

Is it necessary to find a series centered at $0$? And how does multiplying our ordinary power series by $x^r$, as the Frobenius method does, correct everything?

Can we use a power series at points other than $0$, or is the Frobenius method the only way out?


BEST ANSWER

Now we apply power series when $P$ and $Q$ are analytic at $x=0$ and apply Frobenius method when $P$ and $Q$ are not analytic at $x=0$.

Note that a power series is just a special case of a Frobenius series. You don't have to decide which of the two methods to use; you can simply take a shortcut when you see that $P$ and $Q$ are analytic at the point of expansion.
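To spell this out (my own illustration, not part of the answer): the Frobenius ansatz is

$$y(x) = x^r \sum_{n=0}^{\infty} a_n x^n = \sum_{n=0}^{\infty} a_n x^{n+r}, \qquad a_0 \neq 0,$$

and taking $r = 0$ recovers the ordinary power series $\sum_{n\geq 0} a_n x^n$. The exponent $r$ is not guessed but fixed by the indicial equation; e.g. for the Euler equation $x^2 y'' + x y' - y = 0$, substituting $y = x^r$ gives $r(r-1) + r - 1 = r^2 - 1 = 0$, so $r = \pm 1$, and neither solution $x$ nor $x^{-1}$... well, $x^{-1}$ in particular cannot be captured by a power series with $r = 0$.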

Now, I want to know why we apply the Frobenius method at all. We could equally have taken a power series in powers of $(x−a)$, where $P$ and $Q$ would be analytic at $a$...

Yes, we can expand in powers of $(x-a)$ at any $a$ where $P$ and $Q$ are analytic. But there are reasons why we usually look for an expansion at $x=0$ (or some other singular point) rather than at some other, nonsingular, point:

  • We're often given initial/boundary conditions at $x=0$, and it is natural to choose an expansion that satisfies those conditions a priori.

  • We may want to know the analytic structure of the solutions at $x=0$. For example, our aim may be to compute a numerical approximation to the solution. The solution often has a singularity at $x=0$, so an expansion about a nonsingular point of the equation gives a slowly convergent series with a limited radius of convergence, due to the proximity of the singularity. If we instead use the Frobenius method, then, e.g., for $P$ and $Q$ analytic everywhere except $x=0$, we get an everywhere-convergent series. In any case, the Frobenius method tells you the asymptotic behavior of your solution, which you can use to speed up numerical calculations.
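To make the convergence point concrete, here is a small sketch of my own (the specific equation is chosen for illustration, not taken from the answer). The Euler equation $x^2 y'' + x y' - y = 0$ has the singular solution $y = 1/x$; its Taylor series about the ordinary point $a = 1$ converges only for $|x-1|<1$, whereas the Frobenius form $x^{-1}$ is valid for every $x \neq 0$:

```python
# Taylor expansion of y(x) = 1/x (a solution of x^2 y'' + x y' - y = 0,
# which is singular at x = 0) about the ordinary point a = 1:
#   1/x = sum_{n>=0} (-1)^n (x - 1)^n,   radius of convergence = 1.

def taylor_partial_sum(x, terms):
    """Partial sum of the Taylor series of 1/x about a = 1."""
    return sum((-1) ** n * (x - 1) ** n for n in range(terms))

def exact(x):
    """The Frobenius-form solution x**(-1), valid for all x != 0."""
    return 1.0 / x

# Inside the disk |x - 1| < 1 the series converges to 1/x:
print(taylor_partial_sum(1.9, 200), exact(1.9))  # close agreement

# Outside the disk (x = 2.5) the partial sums blow up, even though
# the actual solution 1/x is perfectly smooth there:
print(taylor_partial_sum(2.5, 200), exact(2.5))
```

The distance from the expansion point $a=1$ to the singularity at $0$ caps the radius of convergence at $1$, exactly as the second answer below the accepted one also notes for general $a$.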

ANSWER

For an equation having a singularity at $x=0$ (say), you can always obtain a power series solution about $x=a$ ($a\ne 0$). But in doing so, the interval of convergence will be $|x-a|<a$ (since $0$ is the nearest singularity and $a$ is the center), i.e. $0<x<2a$, which will not explicitly display the singular behavior of the solution $y(x)$ near $x=0$.

Moreover, for $y''+P(x)y'+Q(x)y=0$, the analyticity of $P(x)$ and $Q(x)$ at a point $x_0$ guarantees the analyticity of the solution $y(x)$ in a neighborhood of $x=x_0$; but to describe correctly the behavior of the solution $y(x)$ in the neighborhood of a point that is a singularity of $P(x)$ or $Q(x)$, we have to rely on the Frobenius series.
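A minimal sketch of how the Frobenius method exposes that singular behavior, assuming SymPy is available (the equation is my own example, not from the answer): for a regular singular point at $x=0$, the products $xP(x)$ and $x^2Q(x)$ are analytic there, and their values $p_0$, $q_0$ at $0$ give the indicial equation $r(r-1) + p_0 r + q_0 = 0$, whose roots are the leading exponents $x^r$ of the two solutions.

```python
# Indicial equation at the regular singular point x = 0 for
#   y'' + P(x) y' + Q(x) y = 0   with   P = 1/x,  Q = -1/x^2
# (i.e. the Euler equation x^2 y'' + x y' - y = 0).
import sympy as sp

x, r = sp.symbols('x r')
P = 1 / x
Q = -1 / x**2

# At a regular singular point, x*P and x^2*Q are analytic at 0;
# their values there give the indicial coefficients p0 and q0.
p0 = sp.limit(x * P, x, 0)       # = 1
q0 = sp.limit(x**2 * Q, x, 0)    # = -1

indicial = sp.Eq(r * (r - 1) + p0 * r + q0, 0)
roots = sp.solve(indicial, r)
print(roots)  # Frobenius exponents: solutions behave like x**r near 0
```

Here the roots $r = \pm 1$ tell us one solution behaves like $x$ and the other like $x^{-1}$ near the origin; a Taylor series centered at some $a \neq 0$ carries no such information about $x=0$.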