Relationship between fixed point of differential equation and fixed point of iterated map


I am just starting Strogatz's Nonlinear Dynamics and Chaos. (Please bear in mind that I'm a high school student - I'm learning this on my own time, not for any sort of school assignment, and my math knowledge isn't too advanced).

For differential equations, Strogatz appears to be defining a "fixed point" as being where $dx/dt = 0$. But when I took Pre-Calculus last year, I believe (if I remember correctly) that when we talked about recursive sequences we defined "fixed point" to be where $x_{n+1} = x_n$.

What is the connection, if any, between the differential equation definition of a fixed point and the sequence definition? I'm finding this quite confusing to think about. I did some research both on this forum and elsewhere but couldn't find anything that completely answered my question in a clear manner.


There are 3 answers below.

Answer 1:

In one case you get a constant solution, in the other a constant sequence: when starting at that point, the dynamics "stay fixed" there.

In the theory of differential equations, the terms "stationary point" and "equilibrium point" are also used, which makes it easier to distinguish these two situations.


You get a better conceptual correspondence if, instead of the fixed-point map $x_{n+1}=g(x_n)$, you consider the increment map $x_{n+1}-x_n=f(x_n)$ (with, obviously, $f(x)=g(x)-x$, which requires the space to be a vector space). Then in this situation, too, a fixed point is characterized by $f(x^*)=0$.
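A minimal numeric sketch of this increment-map idea; the map $g$ below is my own illustrative choice, not from the answer:

```python
# Fixed points of the iterated map g are exactly the zeros of the
# increment map f(x) = g(x) - x.

def g(x):
    return x / 2 + 1      # iterated map x_{n+1} = g(x_n); fixed point at x = 2

def f(x):
    return g(x) - x       # increment map x_{n+1} - x_n = f(x_n)

for x in (0.0, 2.0):
    print(x, g(x) == x, f(x) == 0.0)
# x = 2 satisfies both g(2) = 2 and f(2) = 0; x = 0 satisfies neither
```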

Conversely, you can apply the Euler method to the differential equation $\dot x=f(x)$ to get the discretized iteration $x_{n+1}=g_h(x_n)=x_n+hf(x_n)$. Then stationary points of the ODE correspond to fixed points of the map $g_h$, and again to $f(x^*)=0$.
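Here is a short sketch of that correspondence, with $f(x)=x(1-x)$ as an illustrative choice (not from the answer); its stationary points are $x^*=0$ and $x^*=1$:

```python
# Euler discretization of x' = f(x): x_{n+1} = g_h(x_n) = x_n + h*f(x_n).

def f(x):
    return x * (1 - x)    # stationary points of the ODE: x* = 0 and x* = 1

def g_h(x, h=0.1):
    return x + h * f(x)   # one Euler step with step size h

# Starting at a stationary point of the ODE, the iteration never moves,
# since f(1) = 0 implies g_h(1) = 1:
x = 1.0
for _ in range(100):
    x = g_h(x)
print(x)  # 1.0

# Starting elsewhere, the iterates approach the stable fixed point x* = 1:
x = 0.2
for _ in range(200):
    x = g_h(x)
print(x)  # close to 1.0
```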

Answer 2:

Let $X$ be any non-empty set, and let $f \colon X \to X$ be a function mapping $X$ into itself. Then a point $p \in X$ is said to be a fixed point of the function $f$ if and only if $$ f(p) = p. $$

In this definition, we can also take $f \colon X \to Y$, where $Y$ is any set such that $X \subset Y$.

For example, the function $f \colon \mathbb{R} \to \mathbb{R}$, defined by the formula $$ f(x) = x^2 \ \mbox{ for all } x \in \mathbb{R}, $$ has $x = 0$ and $x = 1$ as the only fixed points, because we have $$ f(0) = 0^2 = 0, $$ and $$ f(1) = 1^2 = 1. $$

Now let $X$ be any non-empty set, let $P(X)$ be the power set (i.e. the collection of all the subsets) of set $X$, and let $f \colon X \to P(X)$ be any function mapping $X$ into $P(X)$. Such functions are called set-valued maps. Then a point $p \in X$ is said to be a fixed point of $f$ if and only if $$ p \in f(p). $$ Remember that here $f(p)$ is a set, more precisely a subset of $X$.

In this definition, we can also take functions $f \colon X \to P(Y)$, where $Y$ is any set such that $X \subset Y$.
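A small illustration of the set-valued case on a finite set (the map $F$ below is my own example, not from the answer): a point $p$ is a fixed point when $p$ is an element of the set $F(p)$.

```python
# A set-valued map F: X -> P(X) on X = {1, 2, 3}, represented as a dict
# whose values are subsets of X.

F = {
    1: {2, 3},     # 1 not in F(1): not a fixed point
    2: {1, 2},     # 2 in F(2): fixed point
    3: {3},        # 3 in F(3): fixed point
}

fixed_points = [p for p in F if p in F[p]]
print(fixed_points)  # [2, 3]
```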

Answer 3:

You can start with the definition of a dynamical system. A dynamical system is a triple $\{X,T,\phi^t\}$, where $X$ is called the state space, $T$ is time (an ordered set), and $\phi^t$ is an evolution operator that for every allowed $t\in T$ maps $X$ to $X$ and satisfies $\phi^0=\mathrm{Id}$ and $\phi^{t+\tau}=\phi^t\circ \phi^\tau$. Do not put too much thought into what I have just defined; if you know what a set and a function are, you should be fine.

Now you start playing with different examples. For instance, let's take $X=\mathbb R$, $T=\{0,1,2,\ldots\}$, and $\phi^1(x)=5x$. Clearly both properties of the evolution operator are satisfied.
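A quick numerical check of this example: applying $\phi^1$ repeatedly gives $\phi^n(x)=5^n x$, and both defining properties can be verified directly.

```python
# The example dynamical system with X = R, T = {0, 1, 2, ...}, phi^1(x) = 5x.
# phi(n, x) applies phi^1 a total of n times, so phi(n, x) = 5**n * x.

def phi(n, x):
    for _ in range(n):
        x = 5 * x
    return x

x0 = 3
print(phi(0, x0))          # phi^0 is the identity -> 3
print(phi(2, phi(4, x0)))  # phi^2(phi^4(x0)) ...
print(phi(6, x0))          # ... equals phi^6(x0): both give 5**6 * 3 = 46875
```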

One does not always know the evolution operator explicitly. Sometimes all that is known is the derivative $$ \frac{d}{dt}\phi^t(x)\Big|_{t=0}=f(x). $$ Then it can be shown that $\phi^t$ solves the differential equation $\dot x=f(x)$. The converse is also true: if one has the ODE $\dot x=f(x)$, then its solution gives an example of a dynamical system $\{\mathbb R,\mathbb R,\phi^t\}$ (this is a theorem, and not an easy one, so just accept it), even if we do not know an explicit formula for $\phi^t$. Note that now $T=\mathbb R$. As an example, take $\dot x=5x$.

Now go back to the general definition of a dynamical system and define an orbit through a point $x_0$ as the set $\{x\in X\mid x=\phi^t(x_0),t\in T\}$. What is the simplest orbit? Probably the one that consists of just one point $\hat x$. Let me call such an orbit (quite naturally) a fixed point. How do you find it in my first example? Clearly it is the $x$ that satisfies $5x=x$, hence $x=0$.
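This can be checked numerically for the first example: the orbit through the fixed point of $x\mapsto 5x$ really does consist of a single point.

```python
# Fixed point of the map x -> 5x: solve 5x = x, i.e. x = 0.
# Collect the orbit through x0 = 0 by repeated application of the map.

def phi1(x):
    return 5 * x

x = 0
orbit = {x}
for _ in range(10):
    x = phi1(x)
    orbit.add(x)
print(orbit)  # {0}: the orbit through the fixed point is a single point
```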

How do you find the fixed points of my second example? If you are reading Strogatz's book, you should be able to prove the following:

Lemma. Consider $\dot x=f(x)$ and the corresponding dynamical system. Then $\hat x$ is a fixed point if and only if $f(\hat x)=0$.
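For the second example, $\dot x=5x$, the flow happens to be known explicitly, $\phi^t(x_0)=x_0 e^{5t}$, so the lemma can be checked directly: the orbit through the stationary point $\hat x=0$ (where $f(x)=5x$ vanishes) never moves.

```python
import math

# Explicit flow of x' = 5x: phi^t(x0) = x0 * exp(5*t).

def flow(t, x0):
    return x0 * math.exp(5 * t)

print(flow(1.0, 0.0))   # orbit through the fixed point 0 stays at 0
print(flow(0.0, 2.0))   # phi^0 is the identity -> 2.0
# semigroup property phi^(t+s) = phi^t o phi^s (up to floating-point error):
print(math.isclose(flow(0.3, flow(0.7, 2.0)), flow(1.0, 2.0)))  # True
```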

To make the matter even a little more confusing, one can readily replace $X=\mathbb R$ with something more complicated, like the set of all differentiable functions. We will still have the definition of a fixed point as an orbit consisting of just one point. In this case, however, that point is a function, and it can itself be a solution of, say, a differential equation!