How to find unconstrained minimizer of $f(x)$, $f$ convex when the gradient is not invertible and $0 \not\in \text{im}(\nabla f(x))$?


Let $f:\mathbb{R}^n \to \mathbb{R}$ be a convex, differentiable function. We wish to find a minimizer to $f$.

Consider the unconstrained minimization problem:

$$\min_{x \in \mathbb{R}^n} f(x)$$

By the first-order optimality condition, we should have $\nabla f(x^*) = 0$, where $x^*$ is a global minimizer.

However, in many cases $\nabla f$ is not invertible (so we cannot compute $x^* = (\nabla f)^{-1}(0)$), and moreover $0 \not\in \text{im}(\nabla f)$, where $\text{im}$ denotes the image of the map $\nabla f$.

A concrete example is $f(x) = \log\left(\sum\limits_{i=1}^n e^{\beta x_i}\right)$ with $\beta > 0$. Here $0 \notin \text{im}(\nabla f)$, since the gradient components are positive and always sum to $\beta$, and $\nabla f$ is not a bijection. How do we find a minimizer over all of $\mathbb{R}^n$?
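As a quick numerical sanity check (a sketch not taken from the original post, using only the standard library), we can compute the gradient of this log-sum-exp function and verify that its components sum to $\beta$, so the gradient can never be the zero vector:

```python
import math

def logsumexp(x, beta):
    # f(x) = log(sum_i exp(beta * x_i)); shift by the max for numerical stability
    m = max(x)
    return beta * m + math.log(sum(math.exp(beta * (xi - m)) for xi in x))

def grad_logsumexp(x, beta):
    # i-th component: beta * exp(beta * x_i) / sum_j exp(beta * x_j)
    # (a softmax scaled by beta, so all components are positive)
    m = max(x)
    w = [math.exp(beta * (xi - m)) for xi in x]
    s = sum(w)
    return [beta * wi / s for wi in w]

x = [0.3, -1.2, 2.0]
beta = 0.5
g = grad_logsumexp(x, beta)
print(sum(g))  # the components sum to beta = 0.5, so the gradient is never 0
```

Since every component of $\nabla f(x)$ is strictly positive, no choice of $x$ can satisfy $\nabla f(x) = 0$.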

How do we solve for the minimizer in these cases?

Best answer:

For a differentiable convex function $f$, a point $x_*$ is a global minimizer if and only if $\nabla f(x_*) = 0$.

So you only need to solve the equation $\nabla f(x) = 0$. If it has no solution, then $f$ has no minimizer. In the log-sum-exp example above, $f(x - t\mathbf{1}) = f(x) - \beta t$, so $f$ is unbounded below along the direction $-\mathbf{1}$ and the infimum is not attained.
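The identity $f(x - t\mathbf{1}) = f(x) - \beta t$ can be checked numerically as well (a small sketch, again assuming the log-sum-exp example from the question):

```python
import math

def logsumexp(x, beta):
    # f(x) = log(sum_i exp(beta * x_i)); shift by the max for numerical stability
    m = max(x)
    return beta * m + math.log(sum(math.exp(beta * (xi - m)) for xi in x))

x = [1.0, 2.0, 3.0]
beta = 1.0
for t in [0.0, 10.0, 100.0]:
    shifted = [xi - t for xi in x]
    # f(x - t*1) = f(x) - beta*t: the value drops linearly in t, without bound
    print(t, logsumexp(shifted, beta))
```

Each increase of $t$ by $\Delta$ lowers the value by exactly $\beta \Delta$, confirming that $f$ decreases without bound and no minimizer exists.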