I have to find the solutions in $\mathcal{D}'(\mathbb{R})$ of the differential equation $T'+rT=\delta_0$, where $r\in C^\infty(\mathbb{R})$ and $\delta_0$ is the Dirac distribution. I have tried an integrating factor and several of the methods I know for ODEs, but I have not been able to reach anything conclusive.
Could somebody give me a hint?
Thanks in advance.
The integrating factor way:
$$\frac{d}{dx} \left ( e^{R(x)} T(x) \right ) = e^{R(x)} \delta_0(x) \\ e^{R(b)} T(b) - e^{R(a)} T(a)=\int_a^b e^{R(x)} \delta_0(x) dx = \begin{cases} 0 & 0 \not \in [a,b] \\ e^{R(0)} & 0 \in [a,b] \end{cases}.$$
Here $R$ is any antiderivative of $r$ (but the same one all around).
Note that there will be a discontinuity at $0$ so any IVP should strictly speaking be posed at a point other than $0$.
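Carrying that computation to its conclusion (a sketch, with $R$ the same antiderivative of $r$ and $H$ the Heaviside function): away from $0$ the equation is homogeneous, so $e^{R}T$ is constant on each side of $0$, and the jump relation above says it increases by $e^{R(0)}$ across $0$. Hence

$$e^{R(x)} T(x) = C + e^{R(0)} H(x) \quad\Longrightarrow\quad T(x) = C\,e^{-R(x)} + e^{R(0)-R(x)} H(x), \qquad C \in \mathbb{R}.$$

One can verify this directly in $\mathcal{D}'(\mathbb{R})$: since a smooth function times $\delta_0$ only sees its value at $0$, we have $e^{R(0)-R(x)}\delta_0 = e^{R(0)-R(0)}\delta_0 = \delta_0$, and differentiating the product gives $T'+rT=\delta_0$ for every choice of $C$.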
Another way is to use the fact that the delta function is guaranteed to be confined to the highest derivative on the LHS, so you can assume $T'=\delta_0+f$ with $f$ a function, and hence $T=H+F$, where $H$ is the Heaviside function and $F$ is an antiderivative of $f$. Then the original equation becomes
$$\delta_0+F'+rH+rF=\delta_0 \\ F'+rF=-rH$$
which now basically makes sense without distribution theory (provided you do not insist that the equation holds in the classical sense at $0$).
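As a sanity check, the reduced equation $F'+rF=-rH$ can be integrated classically and compared against the closed form the integrating factor yields, namely $F(x)=e^{R(0)-R(x)}-1$ for $x>0$ (taking the particular solution with $F\equiv 0$ on $x<0$). A minimal sketch with the illustrative choice $r(x)=\cos x$, so $R(x)=\sin x$:

```python
import math

def rk4(f, x0, y0, x1, n):
    """Classical 4th-order Runge-Kutta integrator for y' = f(x, y)."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h / 2 * k1)
        k3 = f(x + h / 2, y + h / 2 * k2)
        k4 = f(x + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
    return y

r = math.cos  # illustrative smooth coefficient, so R(x) = sin(x)

# On x > 0 the Heaviside factor is 1, so F' + rF = -rH reads F' = -r(x)F - r(x).
# Start from F(0) = 0 (the solution that vanishes on x < 0) and integrate to x = 1.
F1 = rk4(lambda x, F: -r(x) * F - r(x), 0.0, 0.0, 1.0, 1000)

# Closed form from the integrating factor: F(x) = e^{-sin x} - 1 for x > 0.
print(F1, math.exp(-math.sin(1.0)) - 1.0)
```

The two printed values agree to high accuracy, confirming that $T=H+F$ with this $F$ solves the original distributional equation.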