I am working on a problem where I have the following ODE. $$m\dot{v}+bv=\delta_I(t)$$ where $$\delta_I(t)=\begin{cases}0, & \text{for}&t\ne0\\ I, & \text{for} &t=0\end{cases}.$$ The solution $v(t)$ was derived using Laplace transforms; the ODE in the Laplace domain is (with zero initial conditions) $$(ms+b)V(s)=I,$$ giving $$v(t)=\frac{I}{m}e^{-bt/m}.$$ How does this solution satisfy the original ODE, though? At $t\ne0$ everything is fine, $$-\frac{Ib}{m}e^{-bt/m}+\frac{Ib}{m}e^{-bt/m}=0,$$ while at $t=0$, $$-\frac{Ib}{m}+\frac{Ib}{m}=I.$$ The result seems to be saying $0=I$, which is obviously false. What am I missing here?
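For concreteness, the setup can be probed numerically by approximating the impulse with a tall narrow rectangular pulse of area $I$ and integrating the ODE through it. This is a sketch with assumed parameter values ($m$, $b$, $I$, the pulse width `eps`, and the check times are all arbitrary choices, not from the problem):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed test parameters (arbitrary positive values).
m, b, I = 2.0, 3.0, 5.0
eps = 1e-3  # pulse width: delta_I is approximated by a rectangle of height I/eps

def rhs(t, v):
    # m v' + b v = (approximate) I * delta(t)
    f = I / eps if 0.0 <= t < eps else 0.0
    return [(f - b * v[0]) / m]

# Integrate through the pulse with small steps, then coast to t = 2.
sol1 = solve_ivp(rhs, (0.0, eps), [0.0], max_step=eps / 50, rtol=1e-9, atol=1e-12)
sol2 = solve_ivp(rhs, (eps, 2.0), [sol1.y[0, -1]],
                 t_eval=[0.5, 1.0, 1.5], rtol=1e-9, atol=1e-12)

t_check = np.array([0.5, 1.0, 1.5])
v_num = sol2.y[0]
v_exact = (I / m) * np.exp(-b * t_check / m)
print(np.max(np.abs(v_num - v_exact) / v_exact))  # small relative error
```

As the pulse narrows, the numerical solution approaches $(I/m)e^{-bt/m}$ for $t>0$, which is consistent with the Laplace result describing the post-impulse behavior rather than the instant $t=0$ itself.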
Solution to ODE with Dirac Delta satisfies ODE
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
You arrived at a contradiction because the ODE isn't properly defined at $t=0$. The Dirac delta is singular at $t=0$, and so is the ODE there, so the equation imposes no pointwise condition at $t=0$; it only constrains the solution for $t>0$. The one piece of precise information the Dirac delta provides is that its time integral is 1.
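That integral statement can be turned into a precise condition by integrating the ODE over a shrinking interval around $t=0$ (writing $\delta_I(t)=I\,\delta(t)$):
$$\int_{-\epsilon}^{\epsilon}\left(m\dot{v}+bv\right)dt=\int_{-\epsilon}^{\epsilon}I\,\delta(t)\,dt=I.$$
As $\epsilon\to0$ the $bv$ term contributes nothing (since $v$ is bounded), leaving $$m\left(v(0^+)-v(0^-)\right)=I.$$ With $v(0^-)=0$ this gives $v(0^+)=I/m$, which is exactly the initial value of the Laplace solution: the impulse never appears pointwise in the ODE, it sets the jump in $v$ at $t=0$.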
So I start by finding the solution to the homogeneous eqn: $$my'+by=0 \to y_h=Ce^{-bt/m}$$
Then, we can determine the inhomogeneous part using a Fourier transform of the original eqn: $$my'+by=\delta(t) \to i\omega m \tilde{y}+b\tilde{y}=1 \to \tilde{y}=\frac{1}{m}\frac{1}{b/m+i\omega}$$
From a table of Fourier transforms, $$\frac{1}{b/m+i\omega} \to \theta(t)e^{-bt/m}$$ where $\theta(t)$ is the Heaviside step function.
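The quoted transform pair can be spot-checked numerically. This is a sketch with assumed values for $b/m$ and the test frequency $\omega$, using the convention $\tilde{y}(\omega)=\int y(t)e^{-i\omega t}\,dt$ that matches the $i\omega m\tilde{y}$ step above:

```python
import numpy as np
from scipy.integrate import quad

# Assumed test values (arbitrary positives).
m, b = 2.0, 3.0
a = b / m
omega = 1.7  # arbitrary test frequency

# F[theta(t) e^{-a t}](omega) = integral_0^inf e^{-a t} e^{-i omega t} dt
re, _ = quad(lambda t: np.exp(-a * t) * np.cos(omega * t), 0, np.inf)
im, _ = quad(lambda t: -np.exp(-a * t) * np.sin(omega * t), 0, np.inf)
numeric = re + 1j * im
exact = 1.0 / (a + 1j * omega)
print(abs(numeric - exact))  # small
```

The agreement confirms that $\theta(t)e^{-at}$ transforms to $1/(a+i\omega)$ with $a=b/m$.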
So $f_i= \theta(t)e^{-bt/m}/m$, which, when combined with the homogeneous solution, gives the general solution of this differential equation.
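As a sanity check, one can verify symbolically that this $f_i$ solves the homogeneous equation for $t>0$ (where $\theta(t)=1$) and carries exactly the unit jump at $t=0$ that accounts for $\delta(t)$. A sketch using sympy, with the symbols $m$, $b$ from the post:

```python
import sympy as sp

t, m, b = sp.symbols('t m b', positive=True)
y = sp.exp(-b * t / m) / m  # f_i restricted to t > 0, where theta(t) = 1

# For t > 0 the ODE is homogeneous: m y' + b y should vanish identically.
assert sp.simplify(m * sp.diff(y, t) + b * y) == 0

# Jump at t = 0: m * (y(0+) - y(0-)) should equal the delta's weight, 1.
jump = m * (y.subs(t, 0) - 0)
assert sp.simplify(jump - 1) == 0
print("checks pass")
```

This mirrors the earlier point: away from $t=0$ the solution satisfies the homogeneous equation, and the delta is absorbed entirely into the jump of the solution at the origin.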