Prove that $\int_{\mathbb{R}} \frac{1}{(a^2+t^2)(b^2+t^2)} dt= \frac{\pi}{ab(a+b)}$
For $a,b>0$, the Fourier transform of $f(x) = e^{-a|x|}$ is $\widehat{f}(t) = \dfrac{2a}{a^2+t^2}$.
In the case $a=b$, I already know that $\int_{\mathbb{R}}\dfrac{1}{(a^2+t^2)^2}\, dt = \dfrac{\pi}{2a^3}$.
I have to show that $\int_{\mathbb{R}} \dfrac{1}{(a^2+t^2)(b^2+t^2)}\, dt = \dfrac{\pi}{ab(a+b)}$ in general.
I have no idea how to do it. Could someone help me?
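Since the question already records the Fourier transform of $e^{-a|x|}$, one natural route is Parseval's identity. Here is a sketch, assuming the convention $\widehat{f}(t)=\int_{\mathbb{R}} f(x)e^{-ixt}\,dx$ (the one for which $\widehat{f}(t)=\frac{2a}{a^2+t^2}$), so that Parseval reads $\int_{\mathbb{R}} f\overline{g}\,dx = \frac{1}{2\pi}\int_{\mathbb{R}} \widehat{f}\,\overline{\widehat{g}}\,dt$. Applying this to $f(x)=e^{-a|x|}$ and $g(x)=e^{-b|x|}$ gives
$$\int_{\mathbb{R}} e^{-(a+b)|x|}\,dx = \frac{1}{2\pi}\int_{\mathbb{R}} \frac{2a}{a^2+t^2}\cdot\frac{2b}{b^2+t^2}\,dt.$$
The left-hand side equals $\frac{2}{a+b}$, so rearranging yields $\int_{\mathbb{R}} \frac{dt}{(a^2+t^2)(b^2+t^2)} = \frac{\pi}{ab(a+b)}$.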
Hint: $$\frac{1}{(a^2+t^2)(b^2+t^2)}=\frac{1}{b^2-a^2}\left(\frac{1}{a^2+t^2}-\frac{1}{b^2+t^2}\right).$$
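To finish from the hint: assume $a\neq b$ and use $\int_{\mathbb{R}}\frac{dt}{a^2+t^2}=\frac{\pi}{a}$ on each term, so that
$$\int_{\mathbb{R}} \frac{dt}{(a^2+t^2)(b^2+t^2)} = \frac{1}{b^2-a^2}\left(\frac{\pi}{a}-\frac{\pi}{b}\right) = \frac{\pi(b-a)}{ab(b^2-a^2)} = \frac{\pi}{ab(a+b)}.$$
The case $a=b$ then follows by continuity in $b$, or directly from the formula $\int_{\mathbb{R}}\frac{dt}{(a^2+t^2)^2}=\frac{\pi}{2a^3}$ quoted above.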