In the most general case, I ask how to solve (either analytically or numerically) this equation for $x(t)$
$$x(t) = \int_{-\infty}^t g(t-\tau) f\big(x(\tau)\big) d\tau$$
where $f, g$ are functions that make this convolution convergent (as an improper integral). In my application, I can further assume $g$ decays exponentially as $t \to +\infty$ and vanishes for $t<0$, and that $f$ is bounded.
The problem that got me stuck is that this definition is recursive. I have two ideas:
- Guess an $x_0(t)$, then apply this definition iteratively to get $x_1, x_2, \dots$ The problem is that I have to cut off $x$ and $g$ at infinity, and I cannot guarantee that the error induced by this cutoff is bounded.
- Turn this equation into a differential equation. I would prefer this approach; however, when I differentiate the convolution, I get a boundary term $g(0)f\big(x(t)\big)$ plus the convolution with the derivative $\frac{d}{dt} g(t-\tau)$, which makes it impossible to remove the integral in general.
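For what it is worth, the second idea does close when $g$ itself satisfies a linear ODE. As a sketch, suppose $g(t) = e^{-\gamma t}$ for $t \ge 0$ (consistent with the exponential decay assumed above). Differentiating under the integral by the Leibniz rule gives

$$x'(t) = g(0)\,f\big(x(t)\big) + \int_{-\infty}^t g'(t-\tau)\,f\big(x(\tau)\big)\,d\tau = f\big(x(t)\big) - \gamma\,x(t),$$

since $g' = -\gamma g$ turns the remaining integral into $-\gamma$ times the right-hand side of the original equation, i.e. $-\gamma\,x(t)$. For a general $g$ this trick fails, but any kernel satisfying a constant-coefficient linear ODE yields a (possibly higher-order) ODE for $x$ in the same way.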
The concrete equation I am trying to solve is equation (4) of a physics paper:
$$\Psi(t) = \lambda \int_{-\infty}^t K(t-t') \sin(Vt' + \Psi(t')) dt'$$
where $\lambda, V$ are constants and $K$ is a "propagator" (a function of time supported on $t \ge 0$).
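For numerics, here is a minimal sketch in Python comparing the two ideas, under two assumptions not in the paper: the kernel is taken to be $K(s) = e^{-\gamma s}$ for $s \ge 0$, and $\Psi \equiv 0$ for $t < 0$, so the lower limit of the integral becomes $0$. With this kernel, differentiating gives the equivalent ODE $\Psi' = -\gamma\Psi + \lambda\sin(Vt+\Psi)$. Since $\sin$ is 1-Lipschitz and $\|K\|_1 = 1/\gamma$, the integral map is a contraction whenever $\lambda/\gamma < 1$, so the fixed-point iteration converges for the values chosen below.

```python
import math

# Sketch: Picard iteration vs. the equivalent ODE, under the ASSUMED kernel
# K(s) = exp(-gamma*s) for s >= 0 and the ASSUMED condition Psi(t) = 0 for t < 0.
lam, V, gamma = 0.5, 2.0, 1.0          # lambda/gamma < 1 => contraction
T, n = 10.0, 2000
h = T / n
ts = [i * h for i in range(n + 1)]

# --- Idea 1: fixed-point (Picard) iteration on a grid ---------------------
psi = [0.0] * (n + 1)                  # initial guess Psi_0 = 0
for _ in range(100):
    u = [math.sin(V * t + p) for t, p in zip(ts, psi)]
    new = [0.0] * (n + 1)
    acc = 0.0                          # running convolution integral; the
    d = math.exp(-gamma * h)           # exponential kernel allows a recursion
    for i in range(1, n + 1):          # plus one trapezoid panel per step
        acc = acc * d + 0.5 * h * (u[i] + d * u[i - 1])
        new[i] = lam * acc
    if max(abs(a - b) for a, b in zip(new, psi)) < 1e-12:
        psi = new
        break
    psi = new

# --- Idea 2: equivalent ODE  Psi' = -gamma*Psi + lam*sin(V t + Psi) -------
f = lambda t, y: -gamma * y + lam * math.sin(V * t + y)
y, ode = 0.0, [0.0]
for i in range(n):                     # classical RK4 step
    t = ts[i]
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    ode.append(y)

err = max(abs(a - b) for a, b in zip(psi, ode))
print(err)  # the two answers agree up to discretization error
```

The cutoff worry from the first idea disappears here because the assumed $\Psi \equiv 0$ for $t < 0$ makes the integral start at a finite time; for a kernel that is merely known to decay exponentially, one would instead truncate the history at some $t - T_{\mathrm{cut}}$ and bound the tail by $\lambda \int_{T_{\mathrm{cut}}}^\infty |K|$, using the boundedness of $\sin$.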