A proof of the chain rule goes as follows:
Let $f=u\circ v$, so that $f(x)=u(v(x))$ for $x\in \mathbb R$, where $v$ is differentiable at $x$ and $u$ is differentiable at $v(x)$. Then, by definition, the derivative of $f$ at $x$ is:
$$f'(x)=\lim_{h\to0} \frac{f(x+h)-f(x)}{h}=\lim_{h\to0}\frac{u(v(x+h))-u(v(x))}{h}$$
Let $k=v(x+h)-v(x)$, so that $v(x+h)=v(x)+k$. Thus:
$$f'(x)=\lim_{h\to0}\frac{u(v(x)+k)-u(v(x))}{h}$$
Define:
$$g(t)= \begin{cases} \dfrac{u(v(x)+t)-u(v(x))}{t}, & t\neq 0\\[1ex] u'(v(x)), & t=0 \end{cases}$$
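(A side remark spelling out a step the proof uses implicitly: the value chosen at $t=0$ is exactly the one that makes $g$ continuous there, since by the definition of the derivative of $u$ at $v(x)$,
$$\lim_{t\to0}g(t)=\lim_{t\to0}\frac{u(v(x)+t)-u(v(x))}{t}=u'(v(x))=g(0).)$$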
This way, the following equality holds:
$$\frac{f(x+h)-f(x)}{h}=g(k)\cdot \frac{k}{h}$$
for any $h\neq 0$: if $k\neq 0$ the factors of $k$ cancel, and if $k=0$ then both sides equal $0$, since $v(x+h)=v(x)$. Thus:
$$f'(x)=\lim_{h\to0} \frac{f(x+h)-f(x)}{h}=\lim_{h\to0}g(k)\cdot \frac{k}{h}=\lim_{h\to0}g(k)\cdot \frac{v(x+h)-v(x)}{h}$$
If $\lim_{h\to0}g(k)$ exists, then, since $v$ is differentiable at $x$:
$$f'(x)=\lim_{h\to0}g(k)\cdot \lim_{h\to0}\frac{v(x+h)-v(x)}{h}=u'(v(x))\cdot v'(x)$$
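Not part of the proof, but a quick numerical sanity check of the conclusion, using an arbitrary concrete pair of functions (my choice, purely illustrative): $u(t)=\sin t$ and $v(x)=x^2$, so $f(x)=\sin(x^2)$ and the chain rule predicts $f'(x)=\cos(x^2)\cdot 2x$.

```python
import math

# f = u o v with u(t) = sin(t), v(x) = x**2 (illustrative choice)
def f(x):
    return math.sin(x ** 2)

x = 1.3
h = 1e-6

# Central finite-difference approximation of f'(x)
numeric = (f(x + h) - f(x - h)) / (2 * h)

# Chain-rule prediction: u'(v(x)) * v'(x) = cos(x**2) * 2x
analytic = math.cos(x ** 2) * 2 * x

print(numeric, analytic)
```

The two printed values agree to many decimal places, which is consistent with the formula the proof derives.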
My issue with this is that any arbitrary value could have been assigned to $g$ at $0$ without making the equality $\frac{f(x+h)-f(x)}{h}=g(k)\cdot \frac{k}{h}$ any less valid. Yet the subsequent steps only go through if $g(0)$ is specifically defined to be $u'(v(x))$; otherwise, there is no guarantee that $\lim_{h\to 0}g(k)$ exists.
I can't really point to exactly where the issue would be, but at the same time the process makes me feel uneasy, as if it were some arbitrary "trickery". It seems like it shouldn't work. I'd be glad if anyone could shed some light on it. Thanks.