I wanted to read some introductory material about dynamical systems, since I might need a basic understanding of them for a related task. As far as I can see, in a continuous-time dynamical system we have a state vector $X(t)$ whose entries individually depend on time: $X(t) = (x_1(t),x_2(t),\dots,x_n(t))$. Then we have the time derivative of each state, which depends on the other states at time $t$ and, optionally, directly on $t$, for example: $$\dfrac{d x_i(t)}{d t} = F_i(x_1(t),\dots,x_i(t),\dots,x_n(t),t)$$
My first question is: what forms of functions $x_i(t)$ exist that show this kind of derivative behavior? We simply take the time derivative, and it suddenly becomes a very complex function depending on all the other functions $x_1(t), x_2(t), \dots, x_n(t)$. Do they somehow involve exponential terms ($e^{rt}$), whose derivatives contain them again?
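For concreteness, here is a minimal sketch of the kind of behavior I mean, assuming a simple two-state linear system $\dot x_1 = x_2$, $\dot x_2 = -x_1$ (my own toy example, not from any textbook): its solutions are built from sines and cosines, i.e. from complex exponentials $e^{it}$, and differentiating one state really does produce the other.

```python
import math

# Toy coupled system:  x1' = x2,  x2' = -x1.
# One solution pair is x1(t) = sin(t), x2(t) = cos(t)
# (equivalently, real/imaginary parts of e^{it}).

def x1(t):
    return math.sin(t)

def x2(t):
    return math.cos(t)

h = 1e-6  # step for a central finite-difference derivative
for t in [0.0, 0.7, 2.3]:
    dx1 = (x1(t + h) - x1(t - h)) / (2 * h)
    assert abs(dx1 - x2(t)) < 1e-8      # dx1/dt equals x2(t)
    dx2 = (x2(t + h) - x2(t - h)) / (2 * h)
    assert abs(dx2 + x1(t)) < 1e-8      # dx2/dt equals -x1(t)

print("derivatives match the coupled system")
```

So each state's derivative is "suddenly" a function of the other state, yet the states themselves are ordinary elementary functions.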
My second question is about integrating the time derivatives, and it is a little complicated to state clearly. From the definition of $\dfrac{d x_i(t)}{d t}$, we can write:
$$x_i(t) = \int_{-\infty}^{t}F_i(x_1(\tau),\dots,x_i(\tau),\dots,x_n(\tau),\tau)d\tau$$ using the Fundamental Theorem of Calculus. Now, since $F_i$ is a function of $x_i(t)$ again, we can replace $x_i(\tau)$:
$$x_i(t) = \int_{-\infty}^{t}F_i(x_1(\tau),\dots,\int_{-\infty}^{\tau}F_i(x_1(\alpha),\dots,x_i(\alpha),\dots,x_n(\alpha),\alpha)d\alpha,\dots,x_n(\tau),\tau)d\tau$$ Then we can replace $x_i(\alpha)$ as well, and this substitution can go on infinitely. So, how is this "recursive" behavior justified? I have not seen a similar situation before, which left me confused: how can $x_i(t)$ have a valid definition in this case?
Usually you start with a dynamical system and try to find the solution functions. That is, the changes over small time steps are known and get composed and abstracted into an ODE system. For instance, Newton postulated that all bodies move in a straight line without exterior influence, which is captured by $\ddot x = 0$, and anything that changes this is a force, so $m\ddot x = F(x,t)$ follows from first principles.
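As a rough numerical sketch of that viewpoint (my own illustration, with arbitrary step size and initial data): Newton's law $m\ddot x = F(x,t)$ becomes the first-order system $\dot x = v$, $\dot v = F(x,t)/m$, and composing small forward-Euler time steps with $F = 0$ reproduces straight-line motion.

```python
# Forward-Euler time stepping for m*x'' = F(x, t), rewritten as the
# first-order system  x' = v,  v' = F(x, t) / m.
# With F = 0 the velocity never changes, so the motion is linear,
# matching Newton's first law. All numbers here are arbitrary choices.

m = 2.0

def F(x, t):
    return 0.0          # no exterior influence

x, v = 1.0, 3.0         # initial position and velocity
h, steps = 0.01, 100    # 100 steps of size 0.01 -> total time t = 1
for k in range(steps):
    t = k * h
    x += h * v          # position changes by v over the small step
    v += h * F(x, t) / m  # velocity changes only if a force acts

print(x, v)             # x ≈ 1.0 + 3.0 * 1.0 = 4.0, v still 3.0
```

Each small step is simple; the ODE is just the abstraction of composing all of them.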
The second topic is the Picard iteration, or equivalently the fixed-point integral equation, usually written as $$ x(t)=x_0+\int_{t_0}^t f(x(s),s)\, ds. $$ See the theorems of Peano and Picard–Lindelöf.
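The "infinite substitution" you noticed is exactly this iteration in action: starting from a guess $\varphi_0(t) = x_0$ and repeatedly applying $\varphi_{k+1}(t) = x_0 + \int_{t_0}^t f(\varphi_k(s),s)\,ds$ converges to the solution. A minimal numerical sketch for $x' = x$, $x(0) = 1$ (exact solution $e^t$), using a grid and the trapezoidal rule, both my own arbitrary choices:

```python
import math

# Picard iteration for x'(t) = x(t), x(0) = 1 on [0, 1].
# Iterate phi_{k+1}(t) = 1 + integral_0^t phi_k(s) ds on a uniform grid,
# approximating the integral with the trapezoidal rule.

N = 1001                        # grid points on [0, 1]
h = 1.0 / (N - 1)

phi = [1.0] * N                 # phi_0(t) = x0 = 1 (constant first guess)
for _ in range(20):             # each pass is one "substitution" step
    new = [1.0] * N
    acc = 0.0                   # running value of the integral up to t_i
    for i in range(1, N):
        acc += 0.5 * (phi[i - 1] + phi[i]) * h
        new[i] = 1.0 + acc
    phi = new

print(abs(phi[-1] - math.e))    # error at t = 1 is tiny
```

Each pass substitutes the previous approximation into the integral, just like your nested integrals; the Picard–Lindelöf theorem says this is a contraction, so the infinitely nested expression is not circular but a well-defined limit.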