I'm trying to understand how the initial conditions evolve in a delay differential equation (DDE) and am stuck with the following example.
Suppose we have $$\dot x = ax(t)+bx(t-\tau)$$ where the constants $a,b$ satisfy $a\ne b$ and $b \ne 0$. Now let the initial data be $$x(t)=\theta_0, \quad t \in [-\tau,0],$$ where $\theta_0$ is a constant.
Show that $$x(t) = - \frac{a\theta_0}{b}$$ must hold for $t \in [-2\tau,-\tau]$ in order for $x(t)=\theta_0$ to hold for $t \in [-\tau,0]$.
So far, I've come up with the following.
$$\dot x = ax(t)+bx(t-\tau)$$
Assume $$x(t) = - \frac{a\theta_0}{b}, \quad t \in [-2\tau,-\tau];$$ then the DDE has the form
$$\dot x = -\frac{a^2\theta_0}{b} - a\theta_0$$
Once integrated, this becomes $$x(t) = -\frac{a^2\theta_0}{b}t - a\theta_0 t + C,$$ where $C$ is the constant of integration. It can be found from the condition at the boundary, $x(-\tau)= - \frac{a\theta_0}{b}$: $$- \frac{a\theta_0}{b} = -\frac{a^2\theta_0 (-\tau)}{b} - a\theta_0 (-\tau) + C$$ $$C=- \frac{a\theta_0}{b} -\frac{a^2\theta_0 \tau}{b} - a\theta_0 \tau$$
Substituting back into the solution and collecting terms, I get
$$x(t)= -\frac{a\theta_0}{b} -\frac{a^2\theta_0}{b}(t+\tau)-a\theta_0(t+\tau)$$ for the interval $-\tau \le t \le 0$. But this is nowhere near $x(t)=\theta_0$.
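For concreteness, here is a quick numerical check with arbitrarily chosen sample values (the numbers are my own illustration, not part of the problem); it confirms that the expression I derived is not the constant $\theta_0$ on $[-\tau,0]$:

```python
import numpy as np

# Sample values chosen only for illustration (not part of the problem)
a, b, theta0, tau = 1.0, 2.0, 1.0, 1.0

# The expression I derived above for x(t) on [-tau, 0]
def x_attempt(t):
    return -a*theta0/b - (a**2*theta0/b)*(t + tau) - a*theta0*(t + tau)

for t in np.linspace(-tau, 0.0, 5):
    print(f"t = {t:+.2f}   x(t) = {x_attempt(t):+.3f}   (expected theta0 = {theta0})")
```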
What am I doing wrong?
You want to prove that $x(t) = -\frac{a}{b} \theta_0$ for $t\in[-2\tau,-\tau]$, so don't start by assuming it.
Notice that you have been given that $x(t) = \theta_0$ for $t\in [-\tau,0]$. This means in particular that $\dot x(t) = 0$ for $t\in [-\tau,0]$. Substituting this into the differential equation and solving for $x(t-\tau)$ gives the result.
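To spell out that last step (it is just the hint above written as a short computation): for every $t \in [-\tau,0]$,
$$0 = \dot x(t) = a\theta_0 + b\,x(t-\tau) \quad\Longrightarrow\quad x(t-\tau) = -\frac{a\theta_0}{b},$$
and as $t$ ranges over $[-\tau,0]$, the argument $t-\tau$ ranges over $[-2\tau,-\tau]$, which is exactly the claimed formula (the assumption $b\ne 0$ is what allows the division).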