How can I minimize a function $f(x)$ subject to $f(x) \geq 0$?
\begin{align} \min_x f(x)\\ \text{subject to } f(x) \geq 0\\ x \in \mathbb{R}^n \end{align}
The function $f(x)$ is continuous and twice differentiable, but it can be nonlinear, e.g., $$f(x) = C + \frac{1}{1-x}$$ where $C$ is a constant.
In general, this is a very difficult problem. If you have any more information about $f$ — e.g., that it is convex, nonnegative, or linear — then that will probably simplify the problem. If $n=1$, the problem is also quite simple. In general the problem is solvable, but it may break down into exponentially many sub-problems.
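As a practical starting point, the problem can be handed to a generic constrained solver. Below is a minimal sketch using SciPy's SLSQP method on a hypothetical example function of my own choosing (not from the question): its unconstrained minimum is negative, so the constraint $f(x) \geq 0$ is active at the optimum and the constrained minimum is $0$.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example: unconstrained minimum is -1 at (2, -1),
# which violates f >= 0, so the constrained optimum lies on f = 0.
def f(x):
    return (x[0] - 2.0)**2 + (x[1] + 1.0)**2 - 1.0

# SLSQP accepts inequality constraints of the form g(x) >= 0 directly;
# here the constraint function is f itself.
res = minimize(f, x0=np.zeros(2), method="SLSQP",
               constraints=[{"type": "ineq", "fun": f}])
print(res.x, res.fun)  # f at the solution should be ~0
```

Of course, a local solver like this only finds a local optimum; the discussion below is about certifying the global structure.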
If $x_{min}\in \Omega \subseteq \mathbb{R}^n$ is a minimum of $f$ on $\Omega$, then either $\textbf{D}f(x_{min})=0$ or $x_{min} \in \partial \Omega$ (the boundary of $\Omega$). Since $f$ is continuous and $\Omega=\{x\in\mathbb{R}^n \mid f(x)\geq0\}$, every boundary point $x_\partial \in \partial \Omega$ satisfies $0 = f(x_\partial) \leq f(x)$ for all $x \in \Omega$. Thus, if $f(x) < 0$ for some $x\in \mathbb{R}^n$ (and $\Omega$ is nonempty), then $\partial \Omega$ is nonempty, and the minimum of $f$ on $\Omega$ is $0$.
We can check when this is the case: $f$ takes a negative value if and only if $\min[\lim_{||x||\rightarrow \infty} f(x)]<0$ (the limit taken along each direction, as described below) or $f(x^*)<0$ for some $x^*$ such that $\textbf{D}f(x^*)=0$. If all of these values are nonnegative, then $f(x)\geq 0$ for all $x\in\mathbb{R}^n$. Otherwise, the constrained minimum is $0$.
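This check can be carried out symbolically in simple cases. Here is a sketch with SymPy on a hypothetical one-dimensional example (my own, for illustration): the critical value is negative, so $f$ takes negative values and the constrained minimum is $0$.

```python
import sympy as sp

x = sp.symbols('x', real=True)
# Hypothetical example: f has a critical point with negative value.
f = x**2 - 2*x

crit = sp.solve(sp.diff(f, x), x)          # solve Df = 0
crit_vals = [f.subs(x, c) for c in crit]   # critical values of f
limits = [sp.limit(f, x, sp.oo), sp.limit(f, x, -sp.oo)]

# f takes a negative value iff some critical value or limit is negative.
takes_negative = any(v < 0 for v in crit_vals) or any(L < 0 for L in limits)
print(crit, crit_vals, takes_negative)
```

Here the single critical point is $x=1$ with $f(1)=-1<0$, so `takes_negative` is true and the minimum over $\{f \geq 0\}$ is $0$.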
So, suppose $f(x) \geq 0$ for all $x\in\mathbb{R}^n$, so that $\Omega = \mathbb{R}^n$ and $\partial \Omega = \emptyset$. Then we can find $x_{min}^* = \arg\min_{\textbf{D}f(x^*)=0} f(x^*)$. To verify that $f(x_{min}^*)\leq f(x)$ for all $x$, we need to check that $f(x_{min}^*) \leq \min[\lim_{||x||\rightarrow \infty} f(x)]$. If this is the case, then $x_{min}^*$ minimizes $f$. Otherwise, $f$ has no global minimum and $\inf f = \min[\lim_{||x||\rightarrow \infty} f(x)]$.
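The critical-point comparison above can also be sketched in SymPy. The example function here is again hypothetical: $f(x) = (x^2-1)^2$ is nonnegative everywhere, its critical values are compared, and the behavior at infinity confirms that the best critical point is the global minimizer.

```python
import sympy as sp

x = sp.symbols('x', real=True)
# Hypothetical example with f >= 0 everywhere: f(x) = (x^2 - 1)^2.
f = (x**2 - 1)**2

crit = sp.solve(sp.diff(f, x), x)              # critical points: -1, 0, 1
best = min(crit, key=lambda c: f.subs(x, c))   # argmin over critical points
f_best = f.subs(x, best)                       # smallest critical value

# Both limits at infinity are +oo >= f_best, so f_best is the global minimum.
ok = all(sp.limit(f, x, s) >= f_best for s in (sp.oo, -sp.oo))
print(best, f_best, ok)
```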
To calculate $\min[\lim_{||x||\rightarrow \infty} f(x)]$, write $x = x(r,\theta)$, where $\theta \in [0,\pi]^{n-2}\times [0,2\pi)$ parameterizes a point on the sphere of radius $r$, and take $\lim_{r\rightarrow \infty} f(x(r,\theta)) = g(\theta)$. Minimizing $g$ may still be challenging, but it reduces the dimension by one, and the underlying domain (the unit sphere $S^{n-1}$) is compact.
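For a concrete two-dimensional illustration (the example function is my own, not from the question), the directional limit $g(\theta)$ can be computed symbolically in polar coordinates:

```python
import sympy as sp

r = sp.symbols('r', positive=True)
th = sp.symbols('theta', real=True)
x1, x2 = r*sp.cos(th), r*sp.sin(th)

# Hypothetical example: f(x) = (x1^2 - x2^2) / (x1^2 + x2^2 + 1).
f = (x1**2 - x2**2) / (x1**2 + x2**2 + 1)

# g(theta) = lim_{r -> oo} f(x(r, theta))
g = sp.simplify(sp.limit(sp.simplify(f), r, sp.oo))
print(g, g.subs(th, sp.pi/2))  # g(theta) = cos(2*theta); g(pi/2) = -1
```

Here $g(\theta) = \cos(2\theta)$ attains $-1 < 0$ (e.g., at $\theta = \pi/2$), so this $f$ takes negative values far from the origin and the constrained minimum is $0$.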
If $f$ is a nonnegative convex function, then every local minimum is global and there are no maxima or saddles, so finding any $x^*$ such that $\textbf{D}f(x^*)=0$ guarantees that $f(x^*)\leq f(x)$ for all $x$ (and the minimizer is unique when $f$ is strictly convex).
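In the convex case, an off-the-shelf unconstrained solver suffices, since any stationary point it finds is the global minimum. A minimal sketch, with a hypothetical strictly convex example of my own:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical strictly convex example: f(x) = exp(x1) + exp(-x1) + x2^2,
# whose unique stationary point (0, 0) is the global minimum, f = 2.
def f(x):
    return np.exp(x[0]) + np.exp(-x[0]) + x[1]**2

# For convex f, a gradient-based solver started anywhere converges
# to the global minimizer.
res = minimize(f, x0=np.array([3.0, -2.0]), method="BFGS")
print(res.x, res.fun)  # ~ [0, 0], 2.0
```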