Does a holomorphic function $f : D'(0, 1) \to \mathbb{C}$ have a removable singularity at $0$ if it satisfies a Morera-like property?


By $D'$ I will denote the unit disc $D := D(0,1)$ minus the point $\{0\}$.
Let $f$ be a holomorphic complex-valued function on $D'$.

My question is whether the following claim can be proved (or disproved):

If, for every closed triangle $T \subset D$ such that $0$ is not on the boundary $\partial T$, $f$ satisfies: $$\int_{\partial T} f(z)\mathrm{d}z = 0$$ then $f$ has a removable singularity at $0$.

These integrals are well-defined since $f$ is holomorphic on $D'$ and $0$ is assumed not to lie on the edges of the triangle. However, $0$ may lie in the interior of the triangle, so this assumption goes beyond Morera's theorem applied on $D'$ (which would have provided nothing of value here, since $f$ is already assumed holomorphic on $D'$).

The reason this question interests me is that I have a sequence of holomorphic functions $f_n$ on a disc (the radius shouldn't matter, so I normalised it for this post) which converges pointwise outside of $0$ to a holomorphic function $f$, and which satisfies: $$\int_{\partial T} (f - f_n)(z) \mathrm{d}z \quad\xrightarrow[n \to \infty]{}\quad 0$$ for the aforementioned triangles $T$ with $0$ not on their edges.
Since $\int_{\partial T} f_n(z)\mathrm{d}z = 0$ for each $n$ (this is just Goursat's lemma, applicable to all triangles because each $f_n$ is holomorphic on the whole disc), we obtain $\int_{\partial T} f(z)\mathrm{d}z = 0$. I was wondering whether this would be enough to prove that $f$ has a removable singularity, hence this post.

I feel like this condition is too strong, given how "rigid" the theory of holomorphic functions already is, not to imply that $f$ extends holomorphically to $D$, but of course I could be wrong. It almost resembles Morera's theorem, hence the "Morera-like property" in the title: the hypotheses of Morera's theorem would be fulfilled if $f$ were continuous at $0$ and the triangles with $0$ on their boundary were included, which aren't "that many triangles" in a way. I've tried taking a proof of Morera's theorem (for example this one: Proof of Morera's Theorem for Triangular Contours) and adapting it to my case, to show that $f$ has a holomorphic antiderivative on the whole disc and is therefore holomorphic as the derivative of a holomorphic function, but I don't quite see how to avoid the singularity at $0$ when constructing such an antiderivative. Maybe I'm missing something simple.

Finally, I've tried looking at older posts and I haven't found my exact situation or anything close enough, but please feel free to flag this post as a duplicate (or link to another thread, e.g. on MathOverflow) if I've overlooked a post covering this question. I've chosen triangles for this question, but I wouldn't mind seeing what happens with other families of contours: for example, maybe the claim is wrong for triangles but works for circles, or maybe you need the assumption to hold for every closed curve not meeting $0$, who knows? Do not feel obligated though.

Best answer:

By the residue theorem, $$ \int_{\partial T} f(z)\mathrm{d}z = 2 \pi i N(\partial T, 0) \operatorname {Res}(f, 0) $$ where $N(\partial T, 0)$ is the winding number of $\partial T$ with respect to the origin, and $\operatorname {Res}(f, 0)$ is the residue of $f$ at the origin.

So any holomorphic function in $D'$ whose residue at the origin is zero has that property, and it does not follow that $f$ has a removable singularity.

A simple example is $f(z) = 1/z^2$: its residue at the origin is $0$, so all those triangle integrals vanish, yet it has a double pole at $0$, which is certainly not removable.
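As a sanity check of this answer, the two contour integrals can be approximated numerically. The sketch below (my own illustration, not part of the original answer) integrates $1/z$ and $1/z^2$ over the boundary of a specific triangle inside the unit disc whose interior contains the origin; by the residue theorem the first integral should be $2\pi i$ and the second should vanish, even though $1/z^2$ has a non-removable singularity at $0$.

```python
def segment_integral(f, a, b, n=20000):
    """Midpoint-rule approximation of the contour integral of f along the
    straight segment from a to b (a, b complex)."""
    dz = (b - a) / n
    return sum(f(a + (k + 0.5) * dz) for k in range(n)) * dz

def triangle_integral(f, vertices, n=20000):
    """Contour integral of f over the boundary of a triangle, traversed
    in the order the vertices are given."""
    v1, v2, v3 = vertices
    return (segment_integral(f, v1, v2, n)
            + segment_integral(f, v2, v3, n)
            + segment_integral(f, v3, v1, n))

# A counterclockwise triangle inside the unit disc with 0 in its interior.
T = (0.6, -0.3 + 0.5j, -0.3 - 0.5j)

# f(z) = 1/z has residue 1 at the origin, so the integral is 2*pi*i.
I1 = triangle_integral(lambda z: 1 / z, T)
print(I1)  # close to 2*pi*i = 6.2831...j

# f(z) = 1/z**2 has residue 0 at the origin, so the integral vanishes
# even though the singularity (a double pole) is not removable.
I2 = triangle_integral(lambda z: 1 / z ** 2, T)
print(I2)  # close to 0
```

Taking a triangle that does not enclose the origin would make both integrals vanish, consistent with the winding-number factor $N(\partial T, 0)$ in the formula above.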