My math book contains a proof that
$g'(f(x))=\frac{1}{f'(x)}$, where $g=f^{-1}$.
The author did this by using the definition of derivative in the following way:
Let $k(h)=g(f(x)+h)-g(f(x))$
This means that:
$g'(f(x))=\lim_{h\to 0} \frac{g(f(x) + h) - g(f(x))}{h} = \lim_{h\to 0} \frac{k(h)}{f(x+k(h))-f(x)} = \lim_{k\to 0} \frac{k}{f(x+k)-f(x)} =\frac{1}{f'(x)}$
It might be really simple, but I don't see how $\lim_{h\to 0} \frac{g(f(x) + h) - g(f(x))}{h}$ was rewritten as $\lim_{h\to 0} \frac{k(h)}{f(x+k(h))-f(x)}$.
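As a sanity check before untangling the algebra, the target identity itself can be verified numerically. The sketch below uses the example pair $f(x)=e^x$, $g=\ln$ (an illustrative choice, not from the book) and central differences:

```python
import math

# Example choices (not from the book): f(x) = e^x, so g = f^{-1} = ln.
def f(x):
    return math.exp(x)

def g(y):
    return math.log(y)

x = 1.0
h = 1e-6  # small step for finite differences

# Central-difference estimates of g'(f(x)) and f'(x).
g_prime_at_fx = (g(f(x) + h) - g(f(x) - h)) / (2 * h)
f_prime_at_x = (f(x + h) - f(x - h)) / (2 * h)

print(g_prime_at_fx, 1 / f_prime_at_x)  # the two numbers should agree closely
```

Here $g'(f(1)) = 1/e$, and both printed values land on it to many decimal places, which is consistent with the identity the book claims.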
Since $g(f(x))=x$, the definition of $k$ gives
$$k(h)=g(f(x)+h)-x \implies g(f(x)+h)=x+k(h) \implies f(x)+h=f(x+k(h)) \implies h=f(x+k(h))-f(x).$$
So in the difference quotient, the numerator $g(f(x)+h)-g(f(x))$ is $k(h)$ by definition, and the denominator $h$ equals $f(x+k(h))-f(x)$.