In today's math class, we introduced the function $$f(x)=\frac1x, $$ which has a singularity at $x=0$. Since certain computations required avoiding the singularity, we said something like: "In order to avoid the point $x=0$, stay away from it by assuming, e.g., that $|x|\gg 0$ or that $x\to +\infty.$"
This is clear to me, but I have a question: if we deal with a function with an infinite set of singularities (points), how can the above idea be reformulated so that we avoid each singularity?
I am thinking, for example, of a function like $$f(x)=\frac{1}{\sin x}.$$ I suspect it would involve choosing a suitable $\varepsilon>0$ small enough that we can restrict $x$ to lie outside a neighborhood of radius $\varepsilon$ around each singularity, with these neighborhoods never intersecting. But I am not sure about that.
Could someone please help?
Thank you in advance!
For your example $f(x)=\csc x$, the singularities are at $x=\pi n$ for $n\in \mathbb{Z}$, so the set of $x$ at distance at least $\epsilon\in (0,\pi/2)$ from every singularity is
$$\bigcup _{n\in \mathbb{Z}} [\pi n+\epsilon,\pi(n+1)-\epsilon]$$
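As a quick sanity check of this description, here is a minimal Python sketch (the function name `away_from_singularities` is my own choice, not from the discussion): membership of $x$ in the union above is equivalent to the distance from $x$ to the nearest multiple of $\pi$ being at least $\epsilon$.

```python
import math

def away_from_singularities(x, eps):
    """Check whether x lies in the union of intervals
    [pi*n + eps, pi*(n+1) - eps], i.e. at distance >= eps
    from every singularity pi*n of f(x) = 1/sin(x)."""
    # distance from x to the nearest integer multiple of pi
    dist = abs(x - math.pi * round(x / math.pi))
    return dist >= eps

eps = 0.1
print(away_from_singularities(math.pi / 2, eps))  # True: midway between 0 and pi
print(away_from_singularities(0.05, eps))         # False: too close to 0
```

On this set one also has $|\sin x|\ge \sin\epsilon$, so $|f(x)|\le 1/\sin\epsilon$ is bounded there, which is the point of staying away from the singularities.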