The Cauchy distribution $$ C(x) = \frac{1}{\pi} \frac{\gamma}{(x-\omega)^2 + \gamma^2} $$ approaches the Dirac delta distribution as $\gamma \to 0$. What is the $O(\gamma)$ correction to this limit? I suspect it should involve a Cauchy principal value, based on considering two cases:
Case 1: $x \ne \omega$
Off resonance, $C(x)$ has a series expansion that converges for $|\gamma| < |x - \omega|$: $$ C(x) = \frac{1}{\pi} \frac{\gamma}{(x-\omega)^2} + O(\gamma^3) $$
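As a quick sanity check of this off-resonance expansion, one can compare $C(x)$ numerically to its leading term at a fixed $x \ne \omega$ (the particular values of $\gamma$, $x$, $\omega$ below are arbitrary choices of mine):

```python
import numpy as np

# Arbitrary off-resonance point with |gamma| < |x - omega|
gamma, x, omega = 1e-3, 2.0, 1.5
u = x - omega

exact = gamma / (np.pi * (u**2 + gamma**2))   # full Cauchy density
leading = gamma / (np.pi * u**2)              # leading term of the expansion

# Geometric series: C = (gamma / (pi u^2)) * (1 - gamma^2/u^2 + ...),
# so the error of the leading term should be close to -gamma^3 / (pi u^4)
error = exact - leading
print(error, -gamma**3 / (np.pi * u**4))
```

The printed values agree to many digits, consistent with the claimed $O(\gamma^3)$ remainder.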
Case 2: $x = \omega$
On resonance, $C(x)$ diverges as $\gamma \to 0$: $$ C(x) \sim \frac{1}{\pi\gamma} $$ But this is the same divergent behavior that leads us to identify the $\gamma \to 0$ limit with the Dirac delta distribution.
These two cases lead me to want to write $$ C(x) \sim \delta(x-\omega) + \frac{\gamma}{\pi} P \frac{1}{(x-\omega)^2} $$ where $P$ means to take the Cauchy principal value. But if I use this expansion in real calculations, will I recover the correct $O(\gamma)$ behavior? Or have I made some subtle mistake?
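One way I tried to probe this numerically is to pair both sides with a smooth test function, taking $f(u) = e^{-u^2}$ with $\omega = 0$ (my own choice, nothing special about it). Since a bare principal value of $1/u^2$ against $f$ diverges, I interpret the $P\,1/(x-\omega)^2$ term as the regularized pairing $\int [f(u) - f(0)]/u^2 \, du$ (a Hadamard-finite-part-style reading; this interpretation is an assumption on my part, and is part of what I am asking about):

```python
import numpy as np
from scipy.integrate import quad

f = lambda u: np.exp(-u**2)   # smooth test function, u = x - omega

def integrate_line(g):
    """Integrate g over the real line, with a breakpoint at the u = 0 peak."""
    core, _ = quad(g, -1.0, 1.0, points=[0.0], limit=200)
    left, _ = quad(g, -np.inf, -1.0)
    right, _ = quad(g, 1.0, np.inf)
    return left + core + right

# Regularized pairing of 1/u^2 with f: subtract f(0) so the integral converges.
# For this even f the odd (principal-value) part drops out; analytically the
# value is -2*sqrt(pi).
fp = integrate_line(lambda u: (f(u) - f(0)) / u**2)

residuals = []
for gamma in [0.1, 0.03, 0.01]:
    exact = integrate_line(lambda u: gamma / (np.pi * (u**2 + gamma**2)) * f(u))
    approx = f(0) + gamma / np.pi * fp    # delta term + proposed O(gamma) term
    residuals.append(exact - approx)
    print(f"gamma={gamma:5.2f}  residual={exact - approx:.2e}")
```

If the proposed expansion captures the full $O(\gamma)$ behavior under this regularized reading, the residual should shrink roughly like $\gamma^2$ as $\gamma$ decreases, which is what I appear to see.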