I was wondering about the significance of a function that gives the slope of $y=|x|$ at any $x$. If $$f(x)=|x|,$$ then we could take, as the derivative, $$\frac{d f}{d x}=\frac{x}{|x|}$$ or $$\frac{d f}{d x}=\frac{|x|}{x}.$$ This would give us the slope at any $x$. What is wrong with my approach? How does this connect to limits?
A Pseudo-Derivative for $f(x)=|x|$
131 views. Asked by Bumbble Comm. There are 3 solutions below.
The function is differentiable on $\mathbb{R}$ except at $0$.
Its derivative is $-1$ for $x < 0$, $+1$ for $x > 0$.
At $0$, the function is semi-differentiable, i.e. it has a left derivative ($-1$) and a right derivative ($+1$), but no derivative.
https://en.wikipedia.org/wiki/Semi-differentiability
Because the function is piecewise linear (slope $-1$ on the negatives, $+1$ on the positives), its derivative equals $f(x)/x = |x|/x$ for $x \ne 0$. But that is specific to this function, not a general rule.
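A minimal numerical sketch of the point above (the helper name is mine, not from the answer): away from $0$, the difference quotient of $|x|$ matches $x/|x|$, while at $0$ the two one-sided quotients disagree, so no derivative exists there.

```python
# Forward difference quotient of f(t) = |t| at x (helper name is mine).
def forward_quotient(x, h=1e-6):
    return (abs(x + h) - abs(x)) / h

# Away from 0 the quotient agrees with x/|x| (equivalently |x|/x):
for x in [-2.0, -0.5, 0.5, 3.0]:
    assert abs(forward_quotient(x) - x / abs(x)) < 1e-8

# At x = 0 the one-sided quotients disagree, so |x| has no derivative there:
h = 1e-6
right = (abs(0 + h) - abs(0)) / h   # +1.0
left = (abs(0) - abs(0 - h)) / h    # -1.0
assert right == 1.0 and left == -1.0
```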
A nice way to handle this is via subderivatives, which are designed precisely for cases where the ordinary derivative does not exist (like the absolute value at $0$). A number $m$ is a subgradient of $f$ at $x_0$ if $m(x-x_0)\leq f(x)-f(x_0)$ for all $x$. Taking $f(x)=|x|$ and $x_0=0$: \begin{align} m(x-x_0)\leq f(x)-f(x_0) &\implies mx \leq |x|\\ &\implies \begin{cases}\text{for } x>0 \text{: } m\leq 1\\ \text{for } x<0 \text{: } -m\leq 1\\ \text{for } x=0 \text{: } 0 \leq 0\\ \end{cases}\\ &\implies m \in [-1,1]. \end{align}
Thus the set of subgradients of $|x|$ at $x_0 = 0$ is $[-1,1]$. However, I would not call it a "pseudo"-derivative, since in convex analysis subgradients like these are very useful for solving a lot of problems, particularly optimizing non-differentiable convex functions.
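The case analysis above can be checked numerically. A small sketch (grid and helper name are mine, assumed for illustration): every $m \in [-1,1]$ satisfies $mx \leq |x|$ on a sample grid, and values just outside the interval fail.

```python
# Subgradient test at x0 = 0: m is a subgradient of |x| iff m*x <= |x|
# for all x. We check the inequality on a sample grid (helper name mine).
def is_subgradient(m, xs):
    return all(m * x <= abs(x) + 1e-12 for x in xs)

xs = [i / 10 for i in range(-50, 51)]  # grid on [-5, 5], including 0

# Every m in [-1, 1] passes:
assert all(is_subgradient(m, xs) for m in [-1.0, -0.3, 0.0, 0.7, 1.0])

# Values outside [-1, 1] fail (for some x > 0 or some x < 0 respectively):
assert not is_subgradient(1.1, xs)
assert not is_subgradient(-1.1, xs)
```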
The derivative of $|x|$ is defined for all $x \neq 0$, and it is the "sign" (or "signum") function:
$$\frac{d}{dx}|x| = \textrm{sgn}(x) = \begin{cases} 1 & x > 0 \\ -1 & x < 0 \end{cases}$$
There are alternative derivative-like operations that will give results for non-differentiable functions, and one is the symmetric derivative. The symmetric derivative of $f(x)$ is given by
$$\lim_{h \rightarrow 0} \frac{f(x + h) - f(x - h)}{2h}$$ - notice that it looks at the gradient of secants that are defined symmetrically about the point $x$. If $f$ is differentiable at $x$, then its symmetric derivative is exactly equal to its normal derivative, but you can also take the symmetric derivative of, for example, $|x|$ at a non-differentiable point like $x = 0$:
$\begin{eqnarray}\lim_{h \rightarrow 0} \frac{f(0 + h) - f(0 - h)}{2h} & = & \lim_{h \rightarrow 0} \frac{|h| - |-h|}{2h} \\ & = & \lim_{h \rightarrow 0} \frac{|h| - |h|}{2h} \\ & = & 0,\end{eqnarray}$
since $|-h| = |h|$, so every symmetric difference quotient is exactly $0$.
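This is easy to confirm numerically. A short sketch (helper name is mine): the symmetric quotient of $|x|$ at $0$ is exactly $0$ for every $h$, while at $x \ne 0$ (with small $h$) it matches the ordinary derivative $\textrm{sgn}(x)$.

```python
# Symmetric difference quotient of f at x (helper name is mine).
def sym_quotient(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

# At x = 0 the quotient is exactly 0 for every h, since |h| = |-h|:
for h in [1.0, 0.1, 1e-3, 1e-8]:
    assert sym_quotient(abs, 0.0, h) == 0.0

# At x != 0 it agrees with the ordinary derivative sgn(x):
assert abs(sym_quotient(abs, 2.0, 1e-4) - 1.0) < 1e-9
assert abs(sym_quotient(abs, -2.0, 1e-4) + 1.0) < 1e-9
```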
This is all fine, and it's actually useful in some circumstances, but as so often happens when you try to introduce something to fill an apparent gap, something else breaks - in this case, the Mean Value Theorem no longer holds. Normally, if $f$ is differentiable on an interval then you can find a tangent at some point that is parallel to the secant joining the endpoints of that interval. But if we tried that here on, say, the interval $[-1, 2]$, then there should be a point where the tangent to $|x|$ has gradient $\frac{|2| - |-1|}{2 - (-1)} = \frac{1}{3}$, yet we already know that the symmetric derivative of $|x|$ only takes values in $\{-1, 0, 1\}$.