What I call a ‘black hole’ has a formal name: a ‘limit point of singularities’.
Suppose $k$ is a black hole of the function $f$ (e.g. $0$ is a black hole of $\csc \frac1z$). How does one evaluate an infinitesimally small loop integral around it (assuming the integral exists)?
The problem should be stated more precisely:
Assume that $s_1,s_2,\cdots$ are the singularities of $f(z)$ near $z=k$, ordered by their distance from $k$ in descending order (so $|s_n-k|\to0$).
($f$ can be assumed to be holomorphic everywhere else in $|z-k|<|s_1-k|$.)
Define $r_n=\frac{|s_n-k|+|s_{n+1}-k|}2$.
Then, how does one evaluate $$\lim_{n\to\infty}\oint_{|z-k|=r_n}f(z)\,dz,$$ assuming the limit exists?
I can show that the limit sometimes exists.
By the residue theorem (assuming the $|s_j-k|$ are distinct), $$\oint_{|z-k|=r_1}f(z)\,dz-\oint_{|z-k|=r_n}f(z)\,dz=2\pi i\sum_{j=2}^{n}\text{Res}_{z=s_j}f(z),$$ the sum running over the singularities in the annulus $r_n<|z-k|<r_1$.
The first integral clearly exists, because it is an integral of a continuous function over a rectifiable curve.
The sum of residues may converge as $n\to\infty$. For example, $\cot\frac1z$ has residue $-\frac1{(n\pi)^2}$ at $z=\frac1{n\pi}$, and summing over all $n$ produces something finite.
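Indeed, by the Basel sum, the stated residues add up to a finite value:
$$\sum_{n=1}^{\infty}\left(-\frac{1}{(n\pi)^2}\right)=-\frac{1}{\pi^2}\sum_{n=1}^{\infty}\frac{1}{n^2}=-\frac{1}{\pi^2}\cdot\frac{\pi^2}{6}=-\frac{1}{6}.$$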
If the infinite sum of residues converges, then the limit of the second integral as $n\to\infty$ necessarily exists.
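As a numerical sanity check (not a closed-form evaluation), one can approximate the successive loop integrals for $f(z)=\cot\frac1z$, $k=0$, and verify the residue-theorem relation between consecutive circles. Note that $\cot\frac1z$ is odd, so it also has poles at $z=-\frac1{n\pi}$ with the same residue $-\frac1{(n\pi)^2}$; both poles lie in each annulus. A rough sketch using the trapezoidal rule (spectrally accurate for smooth periodic integrands):

```python
import numpy as np

def loop_integral(f, radius, n_points=4096):
    """Counterclockwise integral of f over |z| = radius via the trapezoidal
    rule, using dz = i*z*dtheta on the parametrization z = radius*e^{i*theta}."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    z = radius * np.exp(1j * theta)
    return np.sum(f(z) * 1j * z) * (2.0 * np.pi / n_points)

f = lambda z: 1.0 / np.tan(1.0 / z)                   # cot(1/z), poles at +-1/(m*pi)
r = lambda n: 0.5 * (1/(n*np.pi) + 1/((n+1)*np.pi))   # circle between s_n and s_{n+1}

for n in range(1, 6):
    I_n, I_next = loop_integral(f, r(n)), loop_integral(f, r(n+1))
    # The annulus r(n+1) < |z| < r(n) contains the two poles +-1/((n+1)pi),
    # each with residue -1/((n+1)pi)^2, so the residue theorem predicts:
    predicted = 2j * np.pi * (-2.0 / ((n + 1) * np.pi) ** 2)
    print(n, I_n - I_next, predicted)
```

The printed differences agree with the prediction, which is consistent with the residue computation above.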
But how to evaluate it?
Any hints/ideas/suggestions are welcome.
Thanks in advance.
The residues at $s_n$ don't determine the integrals, nor their limit (if it exists).
Example: $f(z) = \cot(1/z)$ and $g(z) = \cot(1/z) + 1/z$ have the same poles $s_n = 1/(n\pi)$ and the same residues, but $\oint_C f(z)\; dz$ and $\oint_C g(z)\; dz$ differ by $2\pi i$ for any simple positively-oriented closed contour surrounding $0$ and missing the poles.
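This is easy to confirm numerically; a quick sketch, taking the contour to be the circle $|z|=\frac12\left(\frac1\pi+\frac1{2\pi}\right)$, which surrounds $0$ and passes between the poles $\frac1\pi$ and $\frac1{2\pi}$:

```python
import numpy as np

def loop_integral(f, radius, n_points=4096):
    # Trapezoidal rule on |z| = radius (counterclockwise), dz = i*z*dtheta.
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    z = radius * np.exp(1j * theta)
    return np.sum(f(z) * 1j * z) * (2.0 * np.pi / n_points)

f = lambda z: 1.0 / np.tan(1.0 / z)            # cot(1/z)
g = lambda z: 1.0 / np.tan(1.0 / z) + 1.0 / z  # same poles, same residues at s_n

radius = 0.5 * (1/np.pi + 1/(2*np.pi))         # circle between 1/pi and 1/(2pi)
diff = loop_integral(g, radius) - loop_integral(f, radius)
print(diff)   # should be close to 2*pi*i, since g - f = 1/z
```

The difference is $2\pi i$ regardless of which such contour is chosen, since $g-f=\frac1z$ contributes exactly one residue at $0$.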