My roommates and I have an argument you guys can help settle (peace is at stake, don't let us down!) In undergrad calculus courses, one usually explains what it means for a function to be differentiable at a point $x$, and then differentiable on a domain. Then the focus is entirely on this latter notion. My question is:
Has the notion of differentiability at a point any interest?
That is, I'm looking for a theorem that holds for a function regular at some single point but requires significantly less regularity in a neighborhood of that point, or for a good reason why such a theorem cannot exist.
Of course, this question is very flexible, and any insight is welcome.
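For concreteness (a standard textbook example, not taken from the question itself), the pointwise notion really is much weaker than the local one: there are functions differentiable at exactly one point and discontinuous everywhere else.

```latex
% f is differentiable at 0 and discontinuous at every x \neq 0:
f(x) =
\begin{cases}
x^2 & x \in \mathbb{Q},\\
0   & x \notin \mathbb{Q},
\end{cases}
\qquad
\left|\frac{f(h)-f(0)}{h}\right| \le |h| \xrightarrow[h \to 0]{} 0,
\quad \text{so } f'(0) = 0.
```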
Differentiability at a point is useful when you want to formulate a theorem like the fundamental theorem of calculus in the Lebesgue setting. For instance, a continuous nondecreasing $f:[a,b]\to\mathbb{R}$ that is absolutely continuous (for such $f$, this is equivalent to mapping sets of measure $0$ to sets of measure $0$) is differentiable almost everywhere and satisfies the familiar $f(b) - f(a) = \int_a^b f'(x)\,d\mu(x)$, where $\mu$ is the Lebesgue measure.
Here, of course, $f'$ really means any function that agrees with the derivative of $f$ at every point where $f'$ exists.
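A quick numerical sanity check of this (a toy sketch, with $f(x)=|x|$ as the example; it is absolutely continuous on $[-1,1]$ but not differentiable at the single point $0$, and the value we assign to $f'(0)$ is an arbitrary choice that cannot affect the integral):

```python
import math

def f(x):
    # f(x) = |x|: absolutely continuous on [-1, 1],
    # not differentiable at the single point x = 0.
    return abs(x)

def f_prime(x):
    # A version of f' defined almost everywhere; the value at x = 0
    # is an arbitrary choice (here 0.0) on a set of measure zero.
    if x == 0:
        return 0.0
    return math.copysign(1.0, x)

def midpoint_integral(g, a, b, n=100_000):
    # Midpoint Riemann sum; for this bounded a.e.-continuous integrand
    # it approximates the Lebesgue integral.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

a, b = -1.0, 1.0
lhs = f(b) - f(a)                      # f(1) - f(-1) = 0
rhs = midpoint_integral(f_prime, a, b) # integral of sign(x) over [-1, 1]
print(lhs, rhs)
```

Both sides come out to $0$: the point of nondifferentiability is invisible to the integral.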
Often the functions at hand are differentiable almost everywhere, or you only need to consider functions that are, and differentiability at a point is inherent in that very definition.
Of course, sets of measure $0$ are small in some sense, but such sets can still be dense, so you can easily have a function whose points of nondifferentiability lurk in every interval around a given point, although constructing such weird functions is not entirely trivial.
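One classical construction of this kind (the only ingredient is an enumeration $(q_n)$ of the rationals in $[0,1]$):

```latex
% Let (q_n)_{n \ge 1} enumerate the rationals in [0,1] and set
f(x) = \sum_{n=1}^{\infty} 2^{-n}\,|x - q_n|.
% The series converges uniformly, so f is continuous (indeed convex
% and Lipschitz, hence differentiable a.e.). At each q_n the one-sided
% derivatives of f differ by 2 \cdot 2^{-n} > 0, while at every
% irrational point they agree. So f is nondifferentiable exactly on
% the rationals: a dense set of measure zero.
```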