Showing that an integral of a curve in $\mathbb{C}$ vanishes when the parameter approaches infinity


I'm trying to solve a problem where I have to use the residue theorem to evaluate a certain integral, but I'm stuck at the following point:

I need to show that $\int_{0}^\pi f(R+it)\,dt$ and $\int_\pi^0 f(-R+it)\,dt$, where $$f(z)=\sinh(z)\tanh(z)\sin(\omega z),\qquad\omega\in\mathbb{R},$$ both tend to $0$ as $R$ tends to $+\infty$. I guess that the factor $\sin(\omega(\pm R+it))$ can be neglected since it can be bounded somehow, and I'm trying to play with the terms $e^{\pm R}$ to get something, but it always gets messy.
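(Concretely, the bound on the sine factor I have in mind comes from the standard identity for the sine of a complex argument, writing $z=x+iy$:
$$\sin(x+iy)=\sin x\cosh y + i\cos x\sinh y,$$
so that
$$|\sin(\omega(\pm R+it))|^2=\sin^2(\omega R)\cosh^2(\omega t)+\cos^2(\omega R)\sinh^2(\omega t)\le\cosh^2(\omega t)\le\cosh^2(|\omega|\pi)$$
for $t\in[0,\pi]$, which is independent of $R$. My difficulty is with handling the remaining factors $\sinh$ and $\tanh$.)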

Thank you!