In one dimension, one can consider improper integrals of the form $$\int_a^b \frac{1}{f(x)} \, \mathrm d x$$ where $f$ is a polynomial with a zero at $x=a$ and $f>0$ on $(a,b]$. Such an integral is always divergent because of the singularity at $x=a$.
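Indeed, writing out the leading behaviour of $f$ near its zero makes the divergence explicit: if $a$ is a zero of order $k \ge 1$, then
$$f(x) = c\,(x-a)^k + O\big((x-a)^{k+1}\big), \qquad c>0,$$
so $\dfrac{1}{f(x)} \ge \dfrac{C}{(x-a)^k} \ge \dfrac{C}{x-a}$ for all $x$ sufficiently close to $a$, and $\displaystyle\int_a^{a+\delta} \frac{\mathrm d x}{x-a}$ diverges by comparison.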
Now, I was wondering if this continues to be true in two dimensions.
Let $f(x,y)$ be a polynomial in two variables, and let $\Omega \subset \Bbb R^2$ be a bounded connected open set. Suppose that $\gamma=\{ (x,y) \in \Omega : f(x,y)=0 \}$ is a curve which cuts $\Omega$ into two nonempty regions $$\Omega^+ = \{ (x,y) \in \Omega : f(x,y)>0 \} \quad \mbox{ and } \quad \Omega^- = \{ (x,y) \in \Omega : f(x,y)<0 \}$$ The integral $$I= \int_{\Omega^+} \frac{1}{f(x,y)} \, \mathrm d x \, \mathrm d y$$ is improper, since the integrand is unbounded near $\gamma$.
Does the integral $I$ ever converge, or is it always divergent?
I tried some examples, such as $f(x,y)=1-x-y$ and $f(x,y)=1-x^2-y^2$ on the square $(0,1)^2$. I also tried $f(x,y)=1-xy$ on the square $(0,2)^2$, and in every case the integral diverged.
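For instance, in the linear case $f(x,y)=1-x-y$ on $(0,1)^2$, where $\Omega^+ = \{(x,y) \in (0,1)^2 : x+y<1\}$, the divergence can be seen directly, since the inner integral already blows up logarithmically:
$$\int_{\Omega^+} \frac{\mathrm d x \, \mathrm d y}{1-x-y} = \int_0^1 \left( \int_0^{1-x} \frac{\mathrm d y}{1-x-y} \right) \mathrm d x, \qquad \int_0^{1-x-\varepsilon} \frac{\mathrm d y}{1-x-y} = \ln \frac{1-x}{\varepsilon} \xrightarrow[\varepsilon \to 0^+]{} +\infty,$$
so $I=+\infty$ for every fixed $x \in (0,1)$ already at the level of the inner integral.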