I want to show that
$$\int_0^{\pi} \frac{1}{(a+\cos\theta)^2}d\theta = \frac{a\pi}{(\sqrt{a^2-1})^3}; \;\;a>1$$
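For what it's worth, the identity checks out numerically (a quick sketch using `scipy.integrate.quad`, taking $a = 2$ as an example):

```python
import numpy as np
from scipy.integrate import quad

a = 2.0  # any a > 1 should work

# Left-hand side: numerically integrate 1/(a + cos(theta))^2 over [0, pi]
lhs, _ = quad(lambda t: 1.0 / (a + np.cos(t)) ** 2, 0.0, np.pi)

# Right-hand side: the claimed closed form a*pi / (a^2 - 1)^(3/2)
rhs = a * np.pi / (a**2 - 1) ** 1.5

print(lhs, rhs)  # the two values agree to numerical precision
```

So the formula itself seems right; it's the derivation I'm stuck on.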
I've thought of using the substitution $z = e^{i\theta}$, so that $\cos\theta = \frac{z+z^{-1}}{2}$ and $d\theta = \frac{dz}{iz}$, which (after extending the range to $[0, 2\pi]$ by symmetry, since the integrand is even in $\theta$) turns the integral into a contour integral over the unit circle. However, I haven't been able to evaluate any of the resulting integrals and I'm a bit stuck.
Would I have to use the residue theorem here?
Thanks!