Can someone help me figure this out, please? Here is the question along with what I have so far.
Use contour integration to establish $$\int_{x=-\infty}^\infty\frac1{(x^2+a^2)(x^2+b^2)}{\rm d}x=\frac\pi{ab(a+b)},~~~\text{where}~a,b>0$$
Assume $R>\max\{a,b\}$, so that the poles of the integrand in the upper half-plane lie inside the contour. Then the improper integral can be written as $$\lim_{R\to\infty}\int_{-R}^R\frac1{(x^2+a^2)(x^2+b^2)}{\rm d}x$$ Let $\gamma$ be the positively oriented contour consisting of $\gamma_1\cup\gamma_2$, where $\gamma_1=\{z=x+iy\in\Bbb C\mid-R\le x\le R,\ y=0\}$ and $\gamma_2=\{z=Re^{i\theta}\in\Bbb C\mid0\le\theta\le\pi\}$. Define the functions $f(z)=\frac1{(z^2+a^2)(z^2+b^2)}$ and...
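(As a quick numerical sanity check of the target identity, independent of the contour argument, one can compare quadrature against the closed form. The snippet below is my own sketch; it assumes NumPy/SciPy are available, and the helper name `check_identity` is made up for illustration.)

```python
# Numerical sanity check (not a proof): compare SciPy's quadrature of the
# integrand with the claimed closed form pi / (a*b*(a+b)).
import numpy as np
from scipy.integrate import quad

def check_identity(a, b):
    integrand = lambda x: 1.0 / ((x**2 + a**2) * (x**2 + b**2))
    numeric, _abserr = quad(integrand, -np.inf, np.inf)
    closed_form = np.pi / (a * b * (a + b))
    print(f"a={a}, b={b}: quad={numeric:.10f}, formula={closed_form:.10f}")

# A few sample parameter pairs, including a == b (where the formula still holds).
for a, b in [(1.0, 2.0), (0.5, 3.0), (2.0, 2.0)]:
    check_identity(a, b)
```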
Hint: The poles of $f$ are simple (assuming $a\neq b$) and located at $\pm ai$ and $\pm bi$; only $ai$ and $bi$ lie inside $\gamma$. You can apply the residue theorem once you show that the integral over the part of the contour off the real axis, namely the arc $\gamma_2$, goes to zero as $R\to\infty$.
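To flesh out the hint, here is a sketch of the computation (written for the case $a\neq b$, so both enclosed poles stay simple). At a simple pole, the residue is obtained by dropping the vanishing factor and evaluating the rest: $$\operatorname{Res}_{z=ai}f(z)=\frac1{(z+ai)(z^2+b^2)}\bigg|_{z=ai}=\frac1{2ai\,(b^2-a^2)},\qquad\operatorname{Res}_{z=bi}f(z)=\frac1{(z^2+a^2)(z+bi)}\bigg|_{z=bi}=\frac1{2bi\,(a^2-b^2)}.$$ The residue theorem then gives $$\oint_\gamma f(z)\,{\rm d}z=2\pi i\left(\frac1{2ai\,(b^2-a^2)}+\frac1{2bi\,(a^2-b^2)}\right)=2\pi i\cdot\frac1{2i\,ab(a+b)}=\frac\pi{ab(a+b)}.$$ For the arc, the ML inequality with $|z^2+a^2|\ge R^2-a^2$ and $|z^2+b^2|\ge R^2-b^2$ on $\gamma_2$ yields $$\left|\int_{\gamma_2}f(z)\,{\rm d}z\right|\le\frac{\pi R}{(R^2-a^2)(R^2-b^2)}\xrightarrow[R\to\infty]{}0,$$ so in the limit the integral over $\gamma_1$ alone equals the stated value.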