I'm working through *A First Course in Calculus*, and the book gives two proofs of the chain rule: the first assumes $k\neq0$ and lets $h$ tend to $0$ in the following:
$$\frac{f(u+k)-f(u)}{k}\frac{k}{h}$$
where $u=g(x)$ and $k=g(x+h)-g(x)$,
and the second handling the case $k=0$. But my question is: when can $k$ ever equal $0$?
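To make explicit where that product comes from (my own spelling-out, not the book's wording): since $u+k=g(x)+\bigl(g(x+h)-g(x)\bigr)=g(x+h)$, for $k\neq0$ we can write
$$\frac{f(g(x+h))-f(g(x))}{h}=\frac{f(u+k)-f(u)}{k}\cdot\frac{k}{h},$$
and as $h\to0$ the two factors tend to $f'(u)$ and $g'(x)$ respectively.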
My confusion lies in the fact that in order to take the derivative of the composite function $f(g(x))$ (assuming both $f$ and $g$ are differentiable at $x$ and that $f$ is defined at all values of $g$), the small difference $h$ must be nonzero. So the only time $k=0$ is when $g(x+h)=g(x)$, but that would imply $h=0$, and the entire idea of limits being taken 'around a point' gets thrown out completely.
So that would mean, in order for us to still take the derivative of the composite function $f(g(x))$,
$h\neq0$
$\therefore k\neq0$
and the first proof given is sufficient.
In summary, I'm curious: in what cases/examples does $k=0$?
To quote from the book:
It does not happen very often that $k=0$ for arbitrarily small values of $h$, but when it does happen, the preceding argument must be modified.
Consider any constant function $g(x)$: then $k=g(x+h)-g(x)=0$ for every $h$.
Now consider $g(x)=x^2\sin(1/x)$ for $x\neq0$, with $g(0)=0$. This function is differentiable everywhere (including at $0$), but the first proof breaks down at $x=0$: any interval around $0$ contains infinitely many $h$ for which $k=0$, namely $h=1/(n\pi)$, since then $g(0+h)-g(0)=h^2\sin(n\pi)=0$.
So proving the chain rule separately for constant $g(x)$ will not rescue the argument: this second example is non-constant, yet $k=0$ for arbitrarily small nonzero $h$.
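Here is a quick numerical sketch of that example, just checking the claim above at the sample points $h_n=1/(n\pi)$:

```python
import math

# g(x) = x^2 * sin(1/x) for x != 0, and g(0) = 0; differentiable everywhere.
def g(x):
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

# At x = 0, the points h_n = 1/(n*pi) satisfy sin(1/h_n) = sin(n*pi) = 0,
# so k = g(0 + h_n) - g(0) = 0 (up to floating-point rounding) even though
# h_n != 0 -- and h_n becomes as small as we like by taking n large.
for n in (1, 10, 1000):
    h = 1.0 / (n * math.pi)
    k = g(0 + h) - g(0)
    print(f"h = {h:.2e}, k = {k:.2e}")
```

So $k=0$ does not force $h=0$: it only forces $g(x+h)=g(x)$, which a non-injective $g$ can satisfy at points arbitrarily close to $x$.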