I have to evaluate the following integral using the residue theorem. $$ \int_{|z-i|=3} \frac {dz}{(z^2+2)(z+1)}.$$ First I found that there are three singularities, all simple (first-order) poles. These are at $-1$, $\sqrt2 i$ and $-\sqrt2 i$.
My path is a circle centered at $i$ with a radius of $3$. This means that I should sum the residues at all three poles and none of them will be ignored, correct?
I'm learning the residue theorem now. When I'm applying it to one singularity at a time, will I be using the whole function $ \frac 1 {(z^2+2)(z+1)}$ and applying the shortcut $\operatorname{Res}_{z=z_0}\frac{p(z)}{q(z)} = \frac{p(z_0)}{q'(z_0)}$, where I take the derivative of the denominator, leave the numerator unchanged, and then plug in my singularity value to calculate the residue at that point? Thanks!
You are correct, the region $|z-i|\leq 3$ encloses all the poles of $\frac{1}{(z^2+2)(z+1)}$. In particular
$$ \oint_{|z-i|=3}\frac{dz}{(z^2+2)(z+1)} = \oint_{|z|=R}\frac{dz}{(z^2+2)(z+1)} $$ for any $R$ large enough, since both contours enclose all three poles and hence give the same sum of residues. On the other hand, the length of $|z|=R$ is $2\pi R$ and $\left|\frac{1}{(z^2+2)(z+1)}\right|\ll\frac{1}{R^3}$ on $|z|=R$, so by the ML inequality the RHS is $\ll\frac{2\pi R}{R^3}$, which tends to $0$ as $R\to+\infty$. Since the LHS does not depend on $R$, we may conclude $$ \oint_{|z-i|=3}\frac{dz}{(z^2+2)(z+1)} = 0$$ without actually computing the residues of $f(z)$ at $\pm i\sqrt{2}$ and $-1$.
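If you still want to check this against the "sum of residues" approach from the question, a quick sanity check with SymPy (a sketch, assuming you have `sympy` installed) computes the residue at each of the three simple poles and confirms they sum to $0$:

```python
from sympy import symbols, I, sqrt, residue, simplify

z = symbols('z')
f = 1 / ((z**2 + 2) * (z + 1))

# The three simple poles found in the question
poles = [-1, sqrt(2) * I, -sqrt(2) * I]

# Residue at each simple pole; e.g. at z = -1 this is 1/((-1)^2 + 2) = 1/3
residues = [residue(f, z, p) for p in poles]

# The integral equals 2*pi*i times this sum, so a zero sum gives a zero integral
print(simplify(sum(residues)))  # 0
```

The residues at $\pm i\sqrt2$ are a conjugate pair, $\frac{1}{-4\pm 2\sqrt2 i}$, whose real parts sum to $-\frac13$ and cancel the residue $\frac13$ at $-1$, consistent with the estimation argument above.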