Using the residue theorem to integrate from $-\infty$ to $\infty$

I'm trying to integrate
$$\int_{-\infty}^{\infty} \frac{x^2}{(x^2 + 1)^2(x^2 + 2x + 2)} \, dx$$ given that the function
$$f(z) = \frac{z^2}{(z^2 + 1)^2(z^2 + 2z + 2)}$$ has residues
$$\frac{9i - 12}{100} \quad \text{and} \quad \frac{3 - 4i}{25}$$ at the poles $i$ and $-1 + i$ respectively. Since both of these poles lie in the upper half-plane, I applied the residue theorem: I summed the two residues, multiplied by $2\pi i$, and obtained $\frac{14\pi}{100} = \frac{7\pi}{50}$.
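Spelling out the arithmetic with the two residues quoted above:

$$ \frac{9i - 12}{100} + \frac{3 - 4i}{25} = \frac{(9i - 12) + (12 - 16i)}{100} = \frac{-7i}{100}, \qquad 2\pi i \cdot \frac{-7i}{100} = \frac{14\pi}{100} = \frac{7\pi}{50}. $$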

Have I done this right?

Best answer

You are correct. Formally, you need to evaluate the contour integral

$$ \oint_{\gamma_R} f(z) \, dz := \int_{-R}^R f(x) \, dx + \int_{C_R} f(z) \, dz $$

where $C_R$ is the semicircular arc $Re^{it}$, $t \in [0, \pi]$, traversed counterclockwise. Apply the residue theorem to the closed contour $\gamma_R$ and then take the limit $R \to \infty$; since your function decays like $\frac{1}{z^4}$ as $|z| \to \infty$, the contribution of $C_R$ vanishes in that limit.
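Concretely, once $R$ is large enough that both poles $i$ and $-1 + i$ lie inside $\gamma_R$, the residue theorem gives

$$ \oint_{\gamma_R} f(z) \, dz = 2\pi i \left( \operatorname{Res}_{z = i} f(z) + \operatorname{Res}_{z = -1 + i} f(z) \right) = \frac{7\pi}{50}. $$

For the arc term, you will have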

$$ \left| \int_{C_R} f(z) \, dz \right| \leq \int_{C_R} |f(z)| \, |dz| \leq \pi R \max_{|z| = R} |f(z)| \approx \frac{\pi R}{R^4} \xrightarrow[R \to \infty]{} 0.$$
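To make the $\approx$ step precise, bound each factor of the denominator from below on $|z| = R$ using the reverse triangle inequality (valid once $R > 1 + \sqrt{3}$, so that both factors are positive):

$$ \left| \int_{C_R} f(z) \, dz \right| \leq \pi R \cdot \frac{R^2}{(R^2 - 1)^2 (R^2 - 2R - 2)} \xrightarrow[R \to \infty]{} 0. $$

Letting $R \to \infty$ therefore leaves $\int_{-\infty}^{\infty} f(x) \, dx = \frac{7\pi}{50}$, which is your answer. As a quick sanity check, here is a minimal numerical verification (assuming scipy is available; `f` is just our name for the integrand):

```python
# Numerically confirm that the integral equals 7*pi/50 ≈ 0.439823.
import numpy as np
from scipy.integrate import quad

def f(x):
    # Integrand from the question.
    return x**2 / ((x**2 + 1)**2 * (x**2 + 2*x + 2))

value, abs_err = quad(f, -np.inf, np.inf)
print(value)           # ≈ 0.439823
print(7 * np.pi / 50)  # ≈ 0.439823
```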