Why does the concept of differentiation make sense only on open sets?


I am stuck on the same problem asked in the following question. The book asks us to implicitly differentiate two equations, and it turns out that the answer is "neither of them is implicitly differentiable". Alex M., answering the earlier question, says:

... $y = \pm x$, but this equality does not define an open set, only a finite set of points, and the concept of derivability makes no sense on such sets.

Can someone explain what is going on?


There are 5 answers below.

Answer 1 (score 5)

In order to define the derivative of $f$ at a point $a$, you have to be able to evaluate $f(x)$ for all $x$ near $a$, so that you can form the difference quotient whose limit is the derivative.

If, whenever $a$ is in the domain, all $x$ near $a$ are also in the domain, then the domain is open; that is exactly the definition of an open set.

Answer 2 (score 0)

To add onto Bolker's answer, the fact that you need values of $f(x)$ for all $x$ near $a$ can be seen from the definition of the derivative: $f'(a)=\lim_{h \to 0} \frac{f(a+h)-f(a)}{h}$. When we say $h \to 0$, that means $h$ can approach $0$ from either the positive or the negative side. So we need $f$ to be defined on an open set containing $a$.
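A quick numeric sketch (my own illustration, not part of the original answer): the difference quotient has to be computable for $h$ on both sides of $0$, which is exactly why $f$ must be defined on an open set around $a$. Here the example function $f(x) = x^2$ at $a = 1$ is an assumption chosen for illustration.

```python
def difference_quotient(f, a, h):
    # the quotient (f(a+h) - f(a)) / h whose limit as h -> 0 is f'(a)
    return (f(a + h) - f(a)) / h

f = lambda x: x ** 2  # example function; its derivative at x = 1 is 2

for h in (1e-6, -1e-6):  # approach 0 from the positive and the negative side
    print(difference_quotient(f, 1.0, h))
```

Both quotients approach the same value, $2$; if $f$ were undefined just to the left of $a$, the $h < 0$ quotient could not even be formed.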

Answer 3 (score 1)

In my opinion this is a philosophical question. The "fathers" of the concept of differentiation are Newton and Leibniz, and their primary intention was to analyze physical motion.

If you describe a moving object mathematically, then you get some function $f$ defined on a real interval $J$ corresponding to the linear flow of time with range $\mathbb{R}$, $\mathbb{R}^2$ or $\mathbb{R}^3$ (depending on which aspects you are interested in) corresponding to space. If the object moves at time $t_0$ through a point $s_0$, then there clearly exists an open interval $(t_1,t_2) \subset J$ which contains $t_0$.

Finding the speed at time $t_0$ means to compute the "usual" limit value of average speeds close to $t_0$. This gives the standard definition of $f'(t_0)$ for a function defined on an open interval.

However, to define $f'(t_0)$ it would be sufficient to assume that $t_0$ is a cluster point of $J \setminus \lbrace t_0 \rbrace$. The case where $t_0$ is an isolated point of $J$ doesn't make much sense: there is no limit to take.

In fact, this position is adopted in practical applications. It is standard to work with functions $f : J \to \mathbb{R}$ defined on non-open intervals, and without any concerns we consider the one-sided derivatives at the boundary points of $J$. If e.g. $J = [a,b)$, then we consider the right derivative at $a$ although $f$ is not defined on any open interval containing $a$. Admittedly, the situation here is rather nice because $(a,b)$ doesn't have "gaps", but to define the right derivative at $a$ we only need the fact that $a$ is a cluster point of $(a,b)$.
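A minimal numeric sketch of this point (my own illustration, with $f = \sin$ on $J = [0, \pi)$ chosen as an assumption): at the left endpoint $a = 0$, only the right difference quotient makes sense, since $f(a + h)$ is undefined for $h < 0$.

```python
import math

def right_derivative(f, a, h=1e-6):
    # one-sided quotient using only points to the RIGHT of a,
    # i.e. points that actually lie inside J = [a, b)
    return (f(a + h) - f(a)) / h

# right derivative of sin at 0 should approximate cos(0) = 1
print(right_derivative(math.sin, 0.0))
```

This is exactly the one-sided derivative "from above" that the answer describes; no open interval around $0$ is needed, only that $0$ is a cluster point of $(0, \pi)$.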

In multivariable calculus we deal with functions $f : U \to \mathbb{R}^m$ defined on open subsets $U \subset \mathbb{R}^n$. The derivative at a point $x_0 \in U$ is then defined as a linear map $Df(x_0) : \mathbb{R}^n \to \mathbb{R}^m$ such that $x \mapsto f(x_0) + Df(x_0)(x-x_0)$ is the "best approximation" to $f$ at $x_0$. Here it becomes more lucid why we require $U$ to be open: we need it to show that $Df(x_0)$ is uniquely determined. Again, this assumption on $U$ could be weakened, but the price would be very technical (if not opaque) conditions.

Answer 4 (score 0)

Without loss of generality, consider a function defined on a compact interval of the real line. If you think of the derivative of such a function at a point $p$ of its domain as specifying the best linear approximation to the function near $p$ (this derivative being the slope of the non-vertical tangent line to the graph of the function at $p$), then it is easy to see that the derivative cannot exist at the endpoints of the interval. Therefore the concept of the derivative for functions on such intervals only makes sense when such endpoints are excluded from consideration a priori. Excluding the endpoints makes the interval open, since each remaining point is an interior point -- that is, some neighborhood of each point lies entirely within the interval.

It is easy (I hope) to see how this generalises to functions defined on arbitrary subsets of $\mathbf R$, and even more generally to functions defined on any set. Consider, for example, a function of a real variable whose domain is a set of finitely many points: it is differentiable nowhere. Or if the domain has an isolated point, then the function is not differentiable there, and so on.

Answer 5 (score 4)

Consider any subset $A \subset \mathbb{R}$ and any function $f : A \to \mathbb{R}$.

There is a useful definition of differentiability in this context, which can be formulated after the ordinary definition of differentiability has already been formulated, in which the domain is required to be open. Namely: $f$ is differentiable at a point $x \in A$ if there exists an open set $U \subset \mathbb{R}$ containing $x$, and there exists a function $\hat f : U \to \mathbb{R}$, such that $\hat f$ is differentiable at $x$, and such that $\hat f\mid _{ U \cap A} = f\mid_{ U \cap A}$.

For example, here's an exercise which relates this concept to the general concept of "one-sided derivatives": if $A = [a,b]$ is a closed interval, then $f$ is differentiable at $a$ (according to the above definition) if and only if the following one-sided derivative "from above" exists: $$\lim_{h \to 0^+} \frac{f(a+h) - f(a)}{h} $$ Similarly, $f$ is differentiable at $b$ if and only if the one-sided derivative at $b$ "from below" exists.
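A small numeric sketch of this exercise (my own illustration; the choice $A = [0,1]$, $f(x) = x^2$, and the extension $\hat f(x) = x^2$ on $U = \mathbb{R}$ are assumptions): the one-sided derivative of $f$ "from above" at $0$ agrees with the ordinary derivative of the extension $\hat f$ there.

```python
def one_sided_quotient(f, a, h):
    # (f(a+h) - f(a)) / h; for h > 0 this only uses points to the right of a
    return (f(a + h) - f(a)) / h

f = lambda x: x ** 2      # thought of as defined on A = [0, 1]
fhat = lambda x: x ** 2   # extension to the open set U = R

h = 1e-6
right = one_sided_quotient(f, 0.0, h)            # "from above": h > 0, stays in A
two_sided = one_sided_quotient(fhat, 0.0, -h)    # extension also allows h < 0
print(right, two_sided)
```

Both quotients approach $0$, the derivative of $\hat f$ at $0$, which is what the "if and only if" in the exercise predicts.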

The exact same definition works if you allow $A$ to be a subset of a higher-dimensional Euclidean space, and similarly let the target of $f$ be a higher-dimensional Euclidean space. One context in which this concept is used is to define differentiable functions on objects like polygons or polyhedra (which leads to the general and very useful topic of "manifolds with corners").