Gradient descent and stochastic gradient descent are frequently used to find stationary points (and in some cases even local minima) of a nonconvex function. I was wondering whether the same can be said about the subgradient method: does the subgradient method applied to a nonconvex nonsmooth function also find a stationary point? I know that the set of subgradients can be empty at some points in the nonconvex case, but perhaps a more general notion of gradient, such as the Clarke subdifferential, could be used instead. I have a nonconvex function that is differentiable almost everywhere, and I was wondering whether such a generalized subgradient method would reach at least a local minimum. So far I have not been successful in finding theory that supports my little experiment. Any help would be appreciated!
Subgradient method for nonconvex nonsmooth function
244 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
The subgradient method has indeed been used to optimize nonconvex nonsmooth functions; this line of work goes back to research done in the Soviet Union in the 1970s. These methods rely on the assumption that an oracle can return one subgradient of the objective function at each point of the domain. The best reference for such methods is probably Shor's monograph, Minimization Methods for Non-Differentiable Functions.
The biggest issue with such methods is that they are non-monotone, so there is no clear way to define a stopping criterion. Moreover, convergence can be slow when the step sizes are chosen offline; adaptive step-size selection, tailored to the objective function, is one possible remedy.
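To make the two points above concrete, here is a minimal sketch of an oracle-based subgradient method on a nonconvex nonsmooth toy function. The test function f(x) = |x² − 1|, the offline step rule c/√(k+1), and all names are my own illustrative choices, not taken from Shor's book; the oracle returns one element of the Clarke subdifferential, and because the iteration is non-monotone we track the best point seen rather than trusting the last iterate.

```python
import math

def f(x):
    # Nonconvex, nonsmooth toy objective (illustrative choice):
    # f(x) = |x^2 - 1|, minimized at x = +/-1 with f = 0.
    return abs(x * x - 1.0)

def subgrad(x):
    # Oracle: return one element of the Clarke subdifferential of f at x.
    # Where f is differentiable this is the ordinary gradient; at the
    # kinks x = +/-1 any convex combination is valid, and we return 0.
    if x * x > 1.0:
        return 2.0 * x
    if x * x < 1.0:
        return -2.0 * x
    return 0.0

def subgradient_method(x0, iters=500, c=0.3):
    # Offline diminishing step sizes a_k = c / sqrt(k+1).
    # The function values are non-monotone along the iterates,
    # so we keep the best point observed so far.
    x = x0
    best_x, best_f = x0, f(x0)
    for k in range(iters):
        g = subgrad(x)
        x = x - (c / math.sqrt(k + 1)) * g
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x_star, f_star = subgradient_method(x0=3.0)
```

With the diminishing step rule the iterates oscillate around a kink of f, but the best recorded value still approaches the minimum; with a constant step size the method would only reach a neighborhood whose radius scales with the step.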