Solving linear differential equations using power series


Given a differential equation of the form (Morris Tenenbaum, Ordinary Differential Equations, p. 538)

$$ y^{(n)} + f_{n - 1}(x) y^{(n - 1)} + \cdots + f_1(x) y' + f_0(x) y = Q(x) \tag{1}\label{1}$$

which cannot be solved by the regular power series method if $f_0, f_1, \ldots, f_{n-1}$ are not analytic at the point around which we expand the power series, based on:

Theorem 37.51. If each function $f_0(x), f_1(x), \cdots, f_{n-1}(x), Q(x)$ in (1) is analytic at $x = x_0$, i.e., if each function has a Taylor series expansion in powers of $(x - x_0)$ valid for $|x - x_0| < r$, then there is a unique solution $y(x)$ of (1) which is also analytic at $x = x_0$, satisfying the $n$ initial conditions $$y(x_0) = a_0, \,\, y'(x_0) = a_1, \,\cdots, \,y^{(n-1)} (x_0) = a_{n-1},$$ i.e., the solution has a Taylor series expansion in powers of $(x-x_0)$ also valid for $|x - x_0| < r$.

My question is: why? That is, what is the proof of this theorem? And what happens if we use the regular power series method anyway?

I know that a Taylor expansion requires the function to be defined at the point we expand around, but what do the coefficients $f_0, f_1, \ldots, f_{n-1}$ have to do with this fact?


The author defers the proof of this theorem (and others) to Chapters 11 and 12, as explained in Lesson 39C (Series Solution of a Nonlinear Differential Equation of Order Greater Than One); see Theorems 39.12 and 39.32.

The proof you want is given in Theorems 62.12 and 62.22 (Lesson 62B, page 766) and in Theorem 65.2 (Lesson 65, page 783). Essentially, it consists in showing that every $n$th-order differential equation with given initial conditions is equivalent to a system of first-order differential equations with corresponding initial conditions; the result then follows from the existence and uniqueness theorem for that system of first-order ODEs. The full argument is long, so I will not reproduce it here.
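The reduction to a first-order system can be sketched numerically. Below is a minimal illustration (my own function names, not the book's) for a second-order equation $y'' + f_1(x)y' + f_0(x)y = Q(x)$: set $u_0 = y$, $u_1 = y'$, and integrate the resulting system with classical Runge-Kutta.

```python
import math

def system(x, u, f0, f1, Q):
    # u0 = y, u1 = y'  =>  u0' = u1,  u1' = Q(x) - f1(x) u1 - f0(x) u0
    return [u[1], Q(x) - f1(x) * u[1] - f0(x) * u[0]]

def rk4(x0, u0, h, steps, f0, f1, Q):
    """Classical Runge-Kutta integration of the first-order system."""
    x, u = x0, list(u0)
    for _ in range(steps):
        k1 = system(x, u, f0, f1, Q)
        k2 = system(x + h/2, [u[i] + h/2 * k1[i] for i in range(2)], f0, f1, Q)
        k3 = system(x + h/2, [u[i] + h/2 * k2[i] for i in range(2)], f0, f1, Q)
        k4 = system(x + h, [u[i] + h * k3[i] for i in range(2)], f0, f1, Q)
        u = [u[i] + h/6 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in range(2)]
        x += h
    return u

# Example: y'' + y = 0, y(0) = 1, y'(0) = 0, whose exact solution is cos(x).
y1, _ = rk4(0.0, [1.0, 0.0], 0.01, 100, lambda x: 1.0, lambda x: 0.0, lambda x: 0.0)
print(abs(y1 - math.cos(1.0)) < 1e-6)  # True
```

The same substitution $u_0 = y, u_1 = y', \ldots, u_{n-1} = y^{(n-1)}$ works for any order $n$, which is why the existence theorem for first-order systems settles the $n$th-order case.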

I am not sure what you mean by 'regular power series method'; the author provides several examples of solving ODEs with two power series methods, which he calls 'successive differentiations' and 'undetermined coefficients', and the developments are fairly complete.
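As a concrete sketch of the 'undetermined coefficients' idea (not the book's code, just an illustration): substituting $y = \sum a_k x^k$ into $y'' + y = 0$ and matching coefficients gives the recurrence $a_{k+2} = -a_k/((k+1)(k+2))$, which the initial conditions then pin down.

```python
import math

def series_coefficients(a0, a1, n):
    # Recurrence from y'' + y = 0: a_{k+2} = -a_k / ((k+1)(k+2))
    a = [a0, a1]
    for k in range(n - 2):
        a.append(-a[k] / ((k + 1) * (k + 2)))
    return a

def evaluate(a, x):
    # Partial sum of the power series at x
    return sum(c * x**k for k, c in enumerate(a))

# With y(0) = 1, y'(0) = 0 the series reproduces cos(x).
a = series_coefficients(1.0, 0.0, 20)
print(abs(evaluate(a, 0.5) - math.cos(0.5)) < 1e-12)  # True
```

Here the coefficients of the equation are constants (analytic everywhere), so the theorem guarantees the series solution converges for all $x$, and indeed the recurrence produces the Taylor series of $\cos x$.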

If the coefficients $f_i$ are not analytic at the point of expansion, then you cannot use the power series method: their series expansions are valid nowhere, you cannot compute their values from them, and you cannot even speak of a power series solution there. Conversely, if the coefficients are analytic at the point of expansion, then the resulting series solution is valid inside a radius of convergence at least as large as the smallest radius of convergence of the coefficients' expansions. For example, a polynomial coefficient is a finite series, valid for all $x$, so the series solution of the ODE is then valid for all $x$.
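The radius-of-convergence claim can be seen in a small worked example (my own choice, not from the book): for $(1 - x)y' = y$, i.e. $y' - y/(1-x) = 0$, the coefficient $1/(1-x) = \sum x^k$ has radius of convergence $1$ about $x = 0$, and substituting $y = \sum a_k x^k$ gives $(k+1)a_{k+1} = (k+1)a_k$, so $a_{k+1} = a_k$.

```python
# Recurrence a_{k+1} = a_k for (1 - x) y' = y with y(0) = 1
a = [1.0]
for k in range(30):
    a.append(a[k])

# Constant coefficients => the solution series sum x^k also has radius 1,
# matching the radius of the coefficient's expansion (exact solution: 1/(1-x)).
ratios = [a[k + 1] / a[k] for k in range(30)]
print(all(r == 1.0 for r in ratios))  # True
```

The solution $y = 1/(1-x)$ is singular exactly where the coefficient $1/(1-x)$ ceases to be analytic, consistent with the theorem's guarantee being sharp in this case.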