We are working on an optimization problem that uses thresholds in a real-world decision algorithm. My colleague and I are stumped trying to find a differentiable mathematical function that maps any number less than $0$ to $0$ (or $-1$) and any number greater than or equal to $0$ to $1$. The sign function (https://en.wikipedia.org/wiki/Sign_(mathematics)) is non-differentiable and therefore not usable in our case. Other attempts return numbers extremely close to $1$ or $0$, but applying a floor or ceiling operation to round them makes the result non-differentiable again. If any such function or combination of functions exists, a pointer in the right direction would be immensely appreciated.
Differentiable "Signum" Function or "Step" Function for Gradient Descent
Asked 2026-03-31 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 634 views
There are 2 best solutions below
You will never find a differentiable function $f$ such that $f(x) = 0$ for $x<0$ and $f(x) = 1$ for $x\geq 0$. What you may be interested in are sigmoid functions, which approximate a function with these properties and are often used, e.g. in machine learning, for exactly your purpose: a differentiable approximation to the function you want.
One example is the logistic function $$f(x) = \frac{1}{1+e^{-x}},$$ often used in machine learning. Note that we can also define the family of functions $$f_a(x) = \frac{1}{1+e^{-ax}}$$ for $a > 0$. Then as $a\to\infty$, $f_a$ converges pointwise to a function that is $0$ for $x< 0$ and $1$ for $x\geq 0$.
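As a quick numerical illustration of this sharpening behavior, here is a minimal sketch using only Python's standard library (the function name `logistic` is just for this example):

```python
import math

def logistic(x, a=1.0):
    """Logistic sigmoid with sharpness parameter a: f_a(x) = 1 / (1 + e^(-a*x))."""
    return 1.0 / (1.0 + math.exp(-a * x))

# As a grows, f_a(x) approaches the 0/1 step while remaining differentiable.
for a in (1, 10, 100):
    print(a, logistic(-0.5, a), logistic(0.5, a))
```

Even at $a = 10$ the values at $\pm 0.5$ are already within $0.007$ of $0$ and $1$, yet the derivative $f_a'(x) = a\,f_a(x)(1 - f_a(x))$ exists everywhere.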
If instead you want a function that is $-1$ for $x<0$ and $1$ for $x\geq 0$, then you can try the hyperbolic tangent $$ g_a(x) = \tanh(ax),$$ again for $a>0$. Then again, as $a\to\infty$, $\tanh (ax)$ converges pointwise to what you want.
Note that there is a slight subtlety that I have deliberately left out above: for all $a$, we actually have $f_a(0) = 1/2$ and $g_a(0) = 0$, which isn't exactly what we want. But depending on your application, it is numerically unlikely that the argument of $f_a$ or $g_a$ will ever be exactly zero, so this usually is not an issue. If it is an issue, you'll have to explicitly define $f_a(0) = 1$ or $0$ (respectively, $g_a(0) = 1$ or $-1$), depending on your application.
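If the $x = 0$ case does matter, the explicit override described above might look like the following sketch (a `tanh`-based approximation with the question's convention that $0$ maps to $1$; note the override reintroduces a discontinuity at exactly $x = 0$, which is usually harmless since gradients are only queried at $x \neq 0$):

```python
import math

def step_approx(x, a=50.0):
    """Smooth approximation of the step: tanh(a*x), with the x == 0 case
    overridden to return 1 as in the question's spec (tanh(0) would give 0)."""
    if x == 0:
        return 1.0
    return math.tanh(a * x)
```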
Below, I've plotted $f_a(x)$ in red and $g_a(x)$ in green, for $a = 3$, so you can get a sense of the behavior of these functions.