I have to estimate the integral $$\int_{-\infty}^\infty \frac{e^{-y^2/2}}{((y+y_0)^2+x_0^2)^r} \,dy,$$ for $r\in \mathbb{R}^+$. I am a little amazed that Sage and Wolfram Alpha have nothing to say about it, and that Gradshteyn-Ryzhik doesn't seem to have anything on it either; it feels like a rather natural integral, the denominator being a distance function.
Of course I realize that, if I just expand $1/((y+y_0)^2+x_0^2)^r$ into a Taylor series around $y=-y_0$ and integrate term-by-term, I am not going to get something convergent; what I get is an asymptotic formula. But what is the right order of magnitude of the error term when the formula gets cut off at the $k$th term?
For instance: let $f(y)=1/((y+y_0)^2+x_0^2)^r$ and write $$f(y) = f(-y_0) + \frac{(y+y_0)^2}{2} O^*\!\left(\max_t |f''(t)|\right),$$ where $g = O^*(h)$ means $|g|\le h$. Integrating against the Gaussian weight then gives an error term of size $O(1/x_0^{2r+3})$. If we go up to a higher-order Taylor approximation, we obtain an error term of the form $O(1/x_0^{2(r+k)+1})$ for higher $k$. But can one also give an error term that depends on $y_0$ and not just on $x_0$, and is thus better when $x_0$ is large and $y_0$ is much larger still?
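As a sanity check, here is a quick numerical comparison (a sketch; the values of $r$, $y_0$, $x_0$ below are arbitrary) between the integral and the zeroth-order approximation $\sqrt{2\pi}\,f(-y_0)=\sqrt{2\pi}/x_0^{2r}$:

```python
import math

def integrand(y, r, y0, x0):
    # e^{-y^2/2} / ((y + y0)^2 + x0^2)^r
    return math.exp(-y * y / 2) / ((y + y0) ** 2 + x0 ** 2) ** r

def integral(r, y0, x0, a=-12.0, b=12.0, n=200_000):
    # plain trapezoidal rule; the Gaussian factor makes [-12, 12] ample
    h = (b - a) / n
    s = 0.5 * (integrand(a, r, y0, x0) + integrand(b, r, y0, x0))
    for i in range(1, n):
        s += integrand(a + i * h, r, y0, x0)
    return s * h

r, y0, x0 = 1.5, 2.0, 10.0
exact = integral(r, y0, x0)
leading = math.sqrt(2 * math.pi) / x0 ** (2 * r)  # sqrt(2*pi) * f(-y0)
print(exact, leading)
```

For these values the two agree to within roughly ten percent, and the agreement improves as $x_0$ grows, consistent with the error terms above.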
If you want to approximate near $y=-y_0$, the expansion you have is the right one. But for values of $y$ farther away, i.e. $|y+y_0|\gg x_0$, you can instead write
\begin{align}\frac1{[(y+y_0)^2+x_0^2]^r}&=\frac1{(y+y_0)^{2r}}\frac1{\left[1+\left(\frac{x_0}{y+y_0}\right)^2\right]^r}\\&=\sum_{n=0}^\infty\binom{-r}n\frac{x_0^{2n}}{(y+y_0)^{2r+2n}}\end{align}
which converges for $|y+y_0|>x_0$ and goes to $0$ as $|y|\to\infty$.
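A quick numerical check of this expansion (a sketch; the parameter values are arbitrary, with $y$ chosen so that $|y+y_0|>x_0$, inside the region of convergence):

```python
import math

def binom(a, n):
    # generalized binomial coefficient C(a, n) = a(a-1)...(a-n+1)/n!
    out = 1.0
    for k in range(n):
        out *= (a - k) / (k + 1)
    return out

r, y0, x0 = 1.5, 2.0, 3.0
y = 10.0  # |y + y0| = 12 > x0 = 3, so the series converges

exact = 1.0 / ((y + y0) ** 2 + x0 ** 2) ** r
series = sum(binom(-r, n) * x0 ** (2 * n) / (y + y0) ** (2 * r + 2 * n)
             for n in range(40))
print(exact, series)
```

The partial sums converge rapidly here since the expansion parameter is $(x_0/(y+y_0))^2 = 1/16$; forty terms match the closed form to machine precision.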