History question about the difference quotient: what does "h" mean?


Regarding the difference quotient $\dfrac{f(x+h) - f(x)}{h}$, what is the origin of the usage of "h"? Does it mean anything?

Thank you!

Isaac Newton (1671) used $o$ in his method of fluxions, meaning an infinitely small increment of the "independent variable" (implicitly: time).

G.W. Leibniz (1676) used differentials: $dx, dy, dz$.

Augustin-Louis Cauchy, in his Cours (1821), used $\alpha$ for "an infinitely small quantity".

But $h$ is used in Cauchy's Résumé (1823), page 7:

Let $h$ and $i$ be two distinct quantities, the first one finite and the second one infinitely small, and let $\alpha = \dfrac i h$ be their infinitely small ratio. If we set $\Delta x$ equal to the finite value $h$, the value of the expression

$\Delta y = f(x+\Delta x)- f(x)$

will be called the finite difference of the function $f(x)$, and it will be an ordinary finite quantity.

See also Résumé, page 9:

$\dfrac {\Delta y}{\Delta x} = \dfrac {f(x+i)-f(x)}{i}.$

I think that, once $i$ became the "standard" name for the imaginary unit, $h$ took its place as the "standard" name for the increment.