Differential equations of infinite order: are they ordinary or not?


While defining the general form of an ODE, my instructor stated the following:

$$F(x, y, y^{(1)}, ..., y^{(n)}) = 0, \quad \text{where}\quad n\in \mathbb{N}, n\geq 1 \quad \text{and} \quad y=y(x).$$

However, this implies that the order of an ODE has to be finite.

Question:

First of all, is this really the case? I mean, does "ordinary" mean that the order of the differential equation has to be finite? Secondly, is there any differential equation that contains a derivative of infinite order? Can you provide an example where one is used in the natural sciences, or just in mathematics?


There's no such thing as a derivative of infinite order, so we have to think in terms that actually make sense.

The following is not a rigorous theory (not a theory of any kind), just my own thoughts on the topic. For the theory of such equations the OP could try searching the internet.


For the family of solutions $u_n(x)$ of the equations

$$F_n(x,u,u',\ldots,u^{(n)})=0$$

to converge, we need to 'diminish' the weight of the higher derivatives. So we require:

$$\lim_{n \to \infty} ||F_n(x,u,u',\ldots,u^{(n)})-F_{n+1}(x,u,u',\ldots,u^{(n+1)})||=0$$

where $|| \cdot ||$ is a norm of some kind, for example the integral of the squared function over the whole domain.

Note that the functions need to be labelled, because they have different numbers of arguments and are in general not the same function.

  • One more important thing: the initial conditions. What kind of initial conditions are you going to set for higher- and higher-order equations?

Let me provide one of the simplest examples. We have the well-known Taylor series of an infinitely differentiable function:

$$f(x)=f(a)+f'(a)(x-a)+\frac{f''(a)}{2}(x-a)^2+\dots+\frac{f^{(n)}(a)}{n!}(x-a)^n+\dots$$
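As a quick numerical check (a minimal sketch in Python, using only the standard library and $f(x)=e^x$, for which every derivative at $a$ equals $e^a$), the partial sums of this series converge rapidly:

```python
import math

def taylor_partial_sum(x, a, n):
    """Partial sum of the Taylor series of exp about a, up to order n.
    Every derivative of exp equals exp, so f^(k)(a) = e^a for all k."""
    return sum(math.exp(a) * (x - a) ** k / math.factorial(k)
               for k in range(n + 1))

# Expand e^x about a = 0 and evaluate at x = 1; the error shrinks with n.
errors = [abs(taylor_partial_sum(1.0, 0.0, n) - math.e) for n in (2, 5, 10)]
print(errors)
```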

Now imagine that you don't actually know the function, but you know its value at some $x=x_0$. Then:

$$f_0=f(x_0)=f(a)+f'(a)(x_0-a)+\frac{f''(a)}{2}(x_0-a)^2+\dots+\frac{f^{(n)}(a)}{n!}(x_0-a)^n+\dots$$

Or, rewriting, we have an 'infinite ODE' in the variable $a$ for an unknown function $f$:

$$f(a)-f_0+(x_0-a)\frac{df}{da}+\frac{(x_0-a)^2}{2}\frac{d^2 f}{da^2}+\dots+\frac{(x_0-a)^n}{n!} \frac{d^n f}{da^n}+\dots=0$$


But the dots before the equality sign mean this is not an ODE of any finite order! So the actual problem would be a sequence of equations:

$$f(a)-f_0=0$$

$$f(a)+(x_0-a)\frac{df}{da}-f_0=0$$

$$ \cdots $$

$$f(a)+(x_0-a)\frac{df}{da}+\frac{(x_0-a)^2}{2}\frac{d^2 f}{da^2}+\dots+\frac{(x_0-a)^n}{n!} \frac{d^n f}{da^n}-f_0=0$$


For a nice enough function (one whose derivatives don't grow too fast in absolute value with $n$), the equations converge in the sense stated above.

This means that if we solve the equation for large enough $n$, we get a good approximation to the supposed solution for $n \to \infty$.
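Here is a minimal numerical sketch of this convergence, assuming the 'nice' function $f(a)=e^a$ with $x_0=0$, $f_0=1$, and taking the norm to be the $L^2$ norm on $[0,1]$ approximated by a Riemann sum (all of these concrete choices are mine, not forced by the argument):

```python
import math

def residual(a, n, x0=0.0, f0=1.0):
    """LHS of the n-th truncated equation evaluated at f = exp.
    Every derivative of exp is exp, so f^(k)(a) = e^a."""
    return sum((x0 - a) ** k / math.factorial(k) * math.exp(a)
               for k in range(n + 1)) - f0

def residual_norm(n):
    """Riemann-sum approximation of the L2 norm of the residual on [0, 1]."""
    grid = [i / 100 for i in range(101)]
    h = 1 / 100
    return math.sqrt(sum(residual(a, n) ** 2 for a in grid) * h)

# The residual norm shrinks as the truncation order n grows.
norms = [residual_norm(n) for n in (1, 3, 6)]
print(norms)
```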


But the initial conditions pose quite another problem. You need to set up an infinite sequence of initial conditions that allows you to find the function at each step and also leads to a convergent solution.

For example, let $x_0=0$ and $f_0=1$ in the above sequence of equations. Then we get:

$$f_1-a f_1'=1 \\ f_1=C_1 a+1$$

$$f_2-a f_2'+\frac{a^2}{2}f_2''=1 \\ f_2=C_2 a^2+C_1 a+1$$

And so on. We are getting a general power series. The only way to specify the solutions is to set $C_n=g(n)$, with $g(n)$ decreasing fast enough.
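For instance (a sketch in Python with SymPy; the choice $C_k = 1/k!$ is my own, picked because these are the Taylor coefficients of $e^a$), each resulting partial sum satisfies its truncated equation exactly, so this choice of $g$ recovers $f(a)=e^a$ in the limit:

```python
import sympy as sp

a = sp.symbols('a')

def truncated_lhs(f, n, x0=0, f0=1):
    """LHS of the n-th truncated equation:
    sum_{k=0}^{n} (x0 - a)^k / k! * f^(k)(a) - f0."""
    return sum((x0 - a) ** k / sp.factorial(k) * sp.diff(f, a, k)
               for k in range(n + 1)) - f0

results = []
for n in range(1, 5):
    # With C_k = 1/k!, f_n is the n-th partial sum of e^a.
    f_n = sum(a ** k / sp.factorial(k) for k in range(n + 1))
    results.append(sp.simplify(truncated_lhs(f_n, n)))
print(results)  # each truncated equation is satisfied exactly
```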


These are some thoughts on the question. I'm not sure what kind of applications such equations have, if any. But I'm sure there's a general theory in some literature.