Consider a limit like
$$ f^\circ(x) = \lim_{h\to 1} \frac{f(xh)}{f(x)} $$
For example, with
$$ f(x)=e^x $$
this gives
$$ f^\circ(x) = \lim_{h \to 1} \frac{e^{xh}}{e^{x}} = \lim_{h\to 1} e^{xh-x} = e^0 = 1. $$
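(My own observation, assuming $f$ is continuous at $x$ with $f(x)\neq 0$: the same cancellation happens for every such $f$, since $xh \to x$ as $h \to 1$,
$$ f^\circ(x) = \lim_{h\to 1}\frac{f(xh)}{f(x)} = \frac{f(x)}{f(x)} = 1, $$
so this particular limit carries no information about $f$.)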
Does this have a name, and is there some theory behind it? If not, why is this alternative derivative not used?
It would be nice if someone could point me in the right direction, since I'm unsure what to search for without a name.
The intended application is a machine learning approach that would use this in place of gradient descent based on ordinary derivatives.
I'm only interested in applying this to strictly positive functions (nowhere equal to zero).
Edit: the correct formula, from the book @B.Martin referenced in the comments:
$$ \lim_{x\to a} \left(\frac{f(x)}{f(a)}\right)^{\frac{1}{\ln x-\ln a}} $$
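As a quick numerical sanity check (my own sketch, not from the book; the helper name `bigeometric_limit` is hypothetical, and it assumes $f$ is strictly positive near $a$): evaluating the corrected formula at $x$ slightly above $a$ for $f(x)=e^x$ should approach $e^a$, since $(x-a)/(\ln x-\ln a)\to a$.

```python
import math

def bigeometric_limit(f, a, eps=1e-6):
    # Numerically approximate lim_{x->a} (f(x)/f(a))**(1/(ln x - ln a))
    # by evaluating at x = a + eps (assumes f > 0 near a, and a > 0).
    x = a + eps
    return (f(x) / f(a)) ** (1.0 / (math.log(x) - math.log(a)))

# For f(x) = e^x at a = 2, the limit should be e^2 (about 7.389)
approx = bigeometric_limit(math.exp, 2.0)
```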
Wrong approach; thanks for the comments, I was stuck on my idea there :)