In some lecture notes, it is written:

Isaac Newton uses orthogonality properties to establish the principles of calculus. The author's definitions of the derivative and the integral are based on geometric reasoning in which orthogonality plays a major role.
But I find it hard to come up with examples that support these claims. What do you think?
I believe it all boils down to the idea of decomposing an infinitesimal section of a curve into infinitesimal increments in two different directions. The concept of orthogonality lets us establish that increments along the $x$ axis and along the $y$ axis are, generally speaking, independent. Therefore, if they change simultaneously, there must be some underlying law that relates the two non-interacting increments. This, in turn, leads to the concept of a formalized function and, as a consequence, to differential and integral calculus.
For example, let us take two non-orthogonal vectors as basis: $$ e_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad e_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}. $$
Because $e_2$ has a nonzero component along $e_1$, the two directions are not independent: a displacement along $e_2$ also shifts your position in the direction of $e_1$. Therefore, if you define a function $f$ in terms of these two directions, e.g. $\,f = f\left(e_1,e_2\right)$, it becomes much harder to track its rate of change as one argument varies. In particular, while changing the coordinate along $e_2$, we involuntarily move in the direction of $e_1$ as well, so the rate of change of $\,f$ mixes contributions from both directions.
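Here is a small numerical sketch of this coupling (my own illustration, not from the notes): take a hypothetical $f(x, y) = x^2 + 3y$ and compare the rate of change along $e_1 = (1, 0)$ with the rate along the skew vector $e_2 = (1, 1)$. The derivative along $e_2$ mixes both partial derivatives.

```python
# Hypothetical example function, chosen only for illustration.
def f(x, y):
    return x**2 + 3*y

def directional_derivative(f, point, direction, h=1e-6):
    """Forward-difference approximation of the derivative of f
    at `point` along the (unnormalized) vector `direction`."""
    x, y = point
    dx, dy = direction
    return (f(x + h*dx, y + h*dy) - f(x, y)) / h

p  = (1.0, 2.0)
e1 = (1.0, 0.0)   # aligned with the x axis
e2 = (1.0, 1.0)   # skew: has components along BOTH axes

d1 = directional_derivative(f, p, e1)  # approx. df/dx = 2
d2 = directional_derivative(f, p, e2)  # approx. df/dx + df/dy = 2 + 3 = 5

print(d1, d2)
```

The rate along $e_1$ isolates a single partial derivative, while the rate along the skew $e_2$ is the sum of both partials: the two increments are entangled exactly as described above.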
If, however, we take $e_1$ and $e_2$ to be orthogonal, we have no trouble isolating the change in each coordinate from the other. In that case it becomes easy to define the derivative of $\,f$ with respect to any given vector from the orthogonal set.
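A minimal sketch of why orthogonality makes this isolation easy (my own example, assuming the standard dot product): with an orthonormal basis, each coordinate of a displacement is recovered by a single dot product, independently of the other basis vector; with the skew basis above, the dot product with $e_1$ picks up a contribution from $e_2$ as well.

```python
def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

v = (2.0, 3.0)  # some displacement vector

# Orthonormal basis: each coordinate is one independent dot product.
u1, u2 = (1.0, 0.0), (0.0, 1.0)
c1, c2 = dot(v, u1), dot(v, u2)   # c1 = 2, c2 = 3; fully decoupled

# Skew basis e1 = (1,0), e2 = (1,1): writing v = a*e1 + b*e2 gives
# b = 3 and a = -1, but dot(v, e1) = 2 != a -- projecting onto e1
# does NOT isolate its coordinate, because e2 leaks into that projection.
e1, e2 = (1.0, 0.0), (1.0, 1.0)
b = v[1]        # e2 is the only basis vector with a y-component
a = v[0] - b    # solve v = a*e1 + b*e2 by hand

print((c1, c2), (a, b), dot(v, e1))
```

In the orthonormal case the coordinates are exactly the projections; in the skew case the projection onto $e_1$ disagrees with the actual $e_1$-coordinate, which is the coupling the argument above relies on.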