Intuitive explanation of why some autonomous differential equations go to infinity in finite time


Take any differential equation of the form

$$\frac{dy}{dx}=y^n$$

where $n > 1$. The solution $y(x)$ will reach infinity at a finite value of $x$.

Assuming $y_0 = 1$ for all cases, here are a few examples: $$\frac{dy}{dx}=y^2$$ has the solution $$y=\frac{-1}{x-1}$$
which reaches its asymptote at $x=1$.


The DE $$\frac{dy}{dx}=y^{1.01}$$ has the solution $$y=\left(\frac{-100}{x-100}\right)^{100}$$
which reaches its asymptote at $x=100$.


If you take any DE of the form $$\frac{dy}{dx}=y^{1 + \epsilon}$$ where $\epsilon$ is a very small positive number, the solution is $$y=\left(1-\epsilon x\right)^{-1/\epsilon}$$
which eventually hits its vertical asymptote at the very large value $x=\frac{1}{\epsilon}$.


This has always bugged me. Intuitively, one expects that the solutions to these equations will grow rapidly and aggressively, much faster than the exponential function. But it is not entirely obvious why they should reach an infinite value after a finite time, instead of, say, growing like the Ackermann function or some other function that grows rapidly but stays strictly finite.

Is there an intuitive argument for why these DEs are able to reach infinity in a finite timespan?


There are 5 answers below.

Best answer

Your intuition that a solution to a DE like this should grow quickly but remain finite makes a lot of sense. One justification for this intuition is the estimate Euler's method would give: entirely finite and defined on the whole real line. To repair the intuition, consider the following modification of Euler's method: instead of increasing $x$ by a constant amount each step, increase $x$ only far enough to let $y$ double. Since $y$ doubles with each jump, $y^n$ increases by a factor of $2^n$, so the horizontal jump shrinks by a factor of $\frac{2}{2^n}$ from one jump to the next. Since $n>1$, this ratio is less than one. As a result, the horizontal jumps form a convergent geometric series: $y$ doubles without bound while $x$ converges.

Answer

It's because $\int_1^\infty \frac{1}{y^p}\, dy$ is finite for $p>1$, but infinite for $p=1$: that integral is precisely the time it takes the solution to climb from $1$ to infinity.

Answer

There is a nice discussion of this problem here (p. 423), where the authors show by example that what one expects is not necessarily what happens. Below is a sketch of their proof of a criterion that tells whether a solution will blow up in finite time. Namely, we have a

Theorem:

If $y'=f(y)$, $y(0)=y_0$, and $f(y)>0$ for all $y\ge y_0,$ then $y$ blows up at the finite time $t_1$ if and only if $\int^{\infty}_{y_0}\frac{1}{f(y)}dy=t_1<\infty.$

For the proof, note that $\int^{y(t)}_{y_0}\frac{1}{f(u)}du=t$ whenever the integral is defined. Therefore, if $y$ satisfies $\lim_{t\to t_1^-} y(t)=\infty$, then $\lim_{t\to t_1^-} \int^{y(t)}_{y_0}\frac{1}{f(u)}du=\lim_{t\to t_1^-} t=t_1.$

On the other hand, if the integral converges to $t_1,$ then $t=\int^{y(t)}_{y_0}\frac{1}{f(u)}du<\int^{\infty}_{y_0}\frac{1}{f(u)}du=t_1,$ so $t$ is bounded by $t_1$. Conclude by observing that

$\lim_{t\to t_1^-} \int^{y(t)}_{y_0}\frac{1}{f(u)}du=\lim_{t\to t_1^-} t=t_1=\int^{\infty}_{y_0}\frac{1}{f(u)}du,$ so $\lim_{t\to t_1^-} y(t)=\infty.$
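The criterion is easy to check numerically. A small sketch (the function name and the choice $f(y)=y^p$ with $p=2,3$ and $y_0=1$ are mine): substituting $u=1/y$ turns $\int_{y_0}^\infty y^{-p}\,dy$ into the proper integral $\int_0^{1/y_0} u^{p-2}\,du$, which a midpoint rule handles directly.

```python
# Estimate the blow-up time t1 = ∫_{y0}^∞ dy / y**p via the
# substitution u = 1/y, which gives ∫_0^{1/y0} u**(p-2) du,
# evaluated here with a simple midpoint rule.

def blowup_time(p, y0=1.0, steps=100_000):
    a = 1.0 / y0
    h = a / steps
    return sum(((k + 0.5) * h) ** (p - 2) for k in range(steps)) * h

print(blowup_time(2))  # ≈ 1.0: y' = y**2, y(0)=1 has y = 1/(1-t), blowing up at t = 1
print(blowup_time(3))  # ≈ 0.5: y' = y**3, y(0)=1 has y = (1-2t)**(-1/2), blowing up at t = 1/2
```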

Answer

The point is that $dy/dx = y^p$ is equivalent to $dx/dy = y^{-p}$; i.e., instead of thinking of $y$ as the dependent variable and $x$ as independent, do the reverse. If you think of $x$ as position and $y$ as time, the velocity is $y^{-p}$. If $p > 1$, this decays to $0$ fast enough that the change in $x$, as $y$ runs from some finite positive value to $\infty$, is finite. Now switch points of view again: as $x$ approaches some finite value, $y$ goes to $\infty$.
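Spelled out, the finite change in $x$ as $y$ runs from a starting value $y_0 > 0$ to $\infty$ is (assuming $p>1$):

$$\Delta x=\int_{y_0}^{\infty}y^{-p}\,dy=\frac{y_0^{\,1-p}}{p-1}<\infty,$$

whereas for $p=1$ the same integral diverges, which matches the fact that $y=e^x$ never blows up.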

Answer

I thought I'd come back to this and give an answer that satisfies the original intent of the question as I asked it.

Differential equations are infinitely responsive.

The differential equation reacts instantaneously to any change in its right-hand side. Simply put, this allows for the possibility of an infinite amount of change occurring in a finite period. If you alter $a$ in $a \frac{dy}{dt}$, you see the effect immediately, at time $t$. This is a major point of difference from numerically simulated DEs, where the action of $dy/dt$ only takes effect at $t + \Delta t$.

Physical reality is not infinitely responsive.

Every force or interaction between one particle and another in this universe can only propagate - at maximum - at the speed of light. In the vast majority of cases, the interaction propagates much more slowly than this.

I guess the issue for me came from conflating 'differential equations as a tool for understanding the physical universe' with 'differential equations as a mathematical construct under no obligation to make physical sense'.

If you numerically approximate $dy/dx = y^2$ using the Euler method on your laptop, you will indeed see $y$ rapidly increase beyond the confines of the 32-bit floating-point format, but in exact arithmetic $y$ always stays finite. It's easy to prove this: each step adds the finite increment $\Delta x \cdot y^2$ to the current finite value, so after any finite number of steps $y$ is still finite.
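This can be demonstrated with Python's exact rational arithmetic (a minimal sketch; the step size $h=1/10$ and step count are arbitrary choices of mine):

```python
# Euler's method for dy/dx = y**2 in exact rational arithmetic:
# every iterate is a finite rational number no matter how many steps
# are taken, even though the true solution (y(0)=1) blows up at x = 1.
from fractions import Fraction

def euler_y_squared(y0, h, steps):
    y = Fraction(y0)
    for _ in range(steps):
        y = y + h * y * y   # each step adds a finite increment
    return y

y = euler_y_squared(1, Fraction(1, 10), 25)
print(len(str(y.numerator)))  # thousands of digits: astronomically large, yet finite
```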

When you look at the role differential equations play in physics, it is often in the context of treating an enormous number of particles as a continuous object. Once the scales become smaller than individual particles, the assumptions become unphysical, and so do the results of the DE.