I want to solve an optimization problem of the form $$\underset{x}{\min}\; f(x) + g(x),$$ where $f(x)$ is $\mu$-strongly convex and differentiable with an $L$-Lipschitz continuous gradient, while $g(x)$ is convex, continuous, and $M$-Lipschitz, but not differentiable. I am using a subgradient update of the form $$ x_{k+1}=x_k-\alpha_k(\nabla f(x_k) + z_k), $$ where $z_k\in \partial g(x_k)$ and $\left\|z_k\right\|\leq M$ for all $k$. In my setting, I cannot compute the proximal operator of $g$. What convergence guarantees and convergence rates can be shown for this scheme?
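For concreteness, the update above can be sketched on a toy instance. Everything here is an illustrative assumption, not part of the question: $f(x)=\tfrac12\|x-b\|^2$ (so $\mu=L=1$), $g(x)=\lambda\|x\|_1$, and the classic diminishing step $\alpha_k = 2/(\mu(k+1))$ for the strongly convex case.

```python
import numpy as np

# Toy instance (illustrative assumptions, not from the question):
# f(x) = 0.5*||x - b||^2  -> mu = 1 strongly convex, gradient is 1-Lipschitz
# g(x) = lam*||x||_1      -> convex, (lam*sqrt(n))-Lipschitz, nondifferentiable
rng = np.random.default_rng(0)
n, lam, mu = 5, 0.1, 1.0
b = rng.standard_normal(n)

grad_f = lambda x: x - b
subgrad_g = lambda x: lam * np.sign(x)  # sign(x) is a valid selection of the l1 subdifferential

x = np.zeros(n)
for k in range(1, 5001):
    alpha = 2.0 / (mu * (k + 1))        # diminishing step for the strongly convex setting
    x = x - alpha * (grad_f(x) + subgrad_g(x))

# This toy problem has a closed-form minimizer (soft-thresholding of b),
# so we can check how close the subgradient iterate gets:
x_star = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)
print(np.linalg.norm(x - x_star))
```

With this step-size schedule the method makes no use of a prox, matching the constraint in the question; the trade-off is the slow (sublinear) rate typical of subgradient methods.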
Non-Smooth Convex Optimization
Asked by Subho (https://math.techqa.club/user/subho/detail) on 2025-01-13 · 408 views · 1 answer below
If you have an upper estimate of the minimal value of $f+g$, you could use a subgradient projection algorithm (e.g., Section 29.6 of Bauschke & Combettes' 2017 book). Suppose $C = \{x \mid f(x) + g(x) \leq \xi\} \neq \varnothing$ for some estimate $\xi \in \mathbb{R}$, and let $s(x)$ denote any selection of $\partial(f+g)$, i.e., $s(x) \in \partial(f+g)(x)$ for every $x$. Then you could iterate $$ x_{n+1} = \begin{cases} x_n + \dfrac{\xi - (f(x_n) + g(x_n))}{\|s(x_n)\|^2}\,s(x_n), &\mbox{if } f(x_n)+g(x_n) > \xi,\\ x_n, &\mbox{otherwise.} \end{cases} $$
Under your hypotheses, this algorithm is guaranteed to converge weakly to a point of $C$.
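A minimal sketch of this iteration, on an assumed toy instance ($f(x)=\tfrac12\|x-b\|^2$, $g(x)=\lambda\|x\|_1$, with $\xi$ obtained by padding the known minimal value — in practice $\xi$ would come from whatever upper estimate you have):

```python
import numpy as np

# Hypothetical instance for illustration: f(x) = 0.5*||x - b||^2, g = lam*||.||_1.
rng = np.random.default_rng(1)
n, lam = 5, 0.1
b = rng.standard_normal(n)

F = lambda x: 0.5 * np.dot(x - b, x - b) + lam * np.abs(x).sum()   # f + g
s = lambda x: (x - b) + lam * np.sign(x)   # a selection of the subdifferential of f+g

# For this toy problem the minimizer is known (soft-thresholding), so we can
# build a valid upper estimate xi of min(f+g); then C = {F <= xi} is nonempty.
x_star = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)
xi = F(x_star) + 1e-2

x = np.zeros(n)
for _ in range(100000):
    v = F(x)
    if v <= xi:
        break                               # x has reached the level set C
    g_x = s(x)
    x = x + (xi - v) / np.dot(g_x, g_x) * g_x   # subgradient projection step

print(F(x) <= xi)
```

Note that only a subgradient selection of $f+g$ and the function values are needed — no proximal operator — which is what makes this fit the constraints in the question.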
By the way, this setup would be perfect for Douglas-Rachford splitting if you had a reasonable way to approximate $\operatorname{prox}_g$; Douglas-Rachford also comes with nice convergence guarantees.
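To illustrate that remark: if $\operatorname{prox}_g$ *were* available (purely hypothetical here — the question rules it out; in this sketch $g=\lambda\|\cdot\|_1$, whose prox is soft-thresholding), the Douglas-Rachford iteration would look like:

```python
import numpy as np

# Hypothetical illustration of the Douglas-Rachford remark, assuming prox_g
# is computable. Toy instance: f(x) = 0.5*||x - b||^2, g(x) = lam*||x||_1.
rng = np.random.default_rng(2)
n, lam, gamma = 5, 0.1, 1.0
b = rng.standard_normal(n)

prox_f = lambda y: (y + gamma * b) / (1.0 + gamma)                    # prox of gamma*f
prox_g = lambda y: np.sign(y) * np.maximum(np.abs(y) - gamma * lam, 0.0)  # soft-threshold

y = np.zeros(n)
for _ in range(200):
    x = prox_f(y)
    y = y + prox_g(2 * x - y) - x    # standard Douglas-Rachford update

# The minimizer of f + g has a closed form here (soft-thresholding of b):
x_star = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)
print(np.linalg.norm(prox_f(y) - x_star))
```

The contrast with the subgradient projection scheme above is the point: DR exploits $\operatorname{prox}_g$ to converge much faster, but it is only an option when that operator (or a good approximation) is available.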