A few days back I was learning Lebesgue integration. I was amazed by its ability to extend the reach of Riemann integration.
Are there any new notions of differentiation (other than the metric differential, which only preserves distances, not directions) that extend the concept of ordinary differentiation?
Mathematics is invented by fiddling with old systems and defining them as "new." As you can imagine, this method results in thousands of definitions and millions of theorems. Differentiation is no different. My favorites are discrete and fractional differentiation.
$\textbf{Discrete calculus}$: motivated by restricting the domain of functions to the integers or some subset of the integers, i.e. functions of the form $f: \mathbb{Z} \to \mathbb{R}$, or whatever target space you choose. In this setting, the $\varepsilon$–$\delta$ machinery of traditional real analysis is meaningless. Nonetheless, mathematicians try to extend definitions as naturally as possible. The discrete derivative at $x = c \in \mathbb{Z}$ is defined as: $$\Delta f(c) = f(c+1) - f(c).$$ Compare to the derivative of $g : \mathbb{R} \to \mathbb{R}$ at $x = c$: $$\frac{\mathrm{d}g}{\mathrm{d}x} = \lim_{h \to 0} \, \frac{g(c+h) - g(c)}{h}.$$ With some imagination, you can see that the definitions are not too different. As mentioned before, the concept of $\delta$ is gone, yet we still want the limit to take values from 'arbitrarily' near points in the domain. In the integers we have no choice but to jump forward or backward by integer amounts, so the natural choices of $h$ are $1$ or $-1$, and the convention is $h = 1$. Substituting $h = 1$ into the traditional definition cancels the denominator and leaves the numerator looking exactly like $\Delta f(c)$. The entire field of discrete calculus is built on this pseudo-limiting process, and I encourage you to look into it further for some light (or heavy) reading. What could be stranger than a world where every function is continuous?
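As a quick sanity check on the definition, here is a small Python sketch (the function name is my own) computing the forward difference and verifying the discrete analogue of the power rule: $\Delta(n^2) = (n+1)^2 - n^2 = 2n + 1$, rather than the familiar $2n$.

```python
def discrete_derivative(f, c):
    """Forward difference: Delta f(c) = f(c + 1) - f(c)."""
    return f(c + 1) - f(c)

# The discrete derivative of f(n) = n^2 is 2n + 1, not 2n:
# (n + 1)^2 - n^2 = 2n + 1.
for n in range(-5, 6):
    assert discrete_derivative(lambda m: m * m, n) == 2 * n + 1
```

Note how the extra $+1$ term would vanish in the continuous limit $h \to 0$, but survives here because the smallest possible step is $h = 1$.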
$\textbf{Fractional calculus}$: let's go back to $f: \mathbb{R} \to \mathbb{R}$, or some open subset of $\mathbb{R}$. Recall that in single-variable calculus you learned to differentiate once to find extrema, differentiate twice to test extrema, and more and more for computational practice. The number of times you differentiated was a natural number. The French mathematician Liouville questioned why that had to be the case. Who is stopping you from differentiating a fractional number of times? Can we make sense of $D^{m/n} f(x)$? With some new definitions, it turns out you can. As you would hope, taking two half-derivatives gives the same result as taking the derivative once, so some integrity is maintained (which is the goal of mathematicians). The groundwork is a bit much to type out, but the Wikipedia article on fractional calculus is a great introduction.
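To make the half-derivative claim concrete, here is a short Python sketch (the helper name is mine) using the Riemann–Liouville formula for monomials, $D^a x^k = \frac{\Gamma(k+1)}{\Gamma(k+1-a)}\, x^{k-a}$, to verify that two half-derivatives of $f(x) = x$ recover $f'(x) = 1$.

```python
from math import gamma, sqrt, pi

def frac_deriv_monomial(coeff, k, a):
    """Riemann-Liouville fractional derivative of coeff * x**k.

    Uses D^a x^k = Gamma(k+1)/Gamma(k+1-a) * x**(k-a) (valid for k > -1)
    and returns the pair (new_coeff, new_exponent).
    """
    return coeff * gamma(k + 1) / gamma(k + 1 - a), k - a

# One half-derivative of f(x) = x gives (2/sqrt(pi)) * x**0.5 ...
c1, e1 = frac_deriv_monomial(1.0, 1, 0.5)
assert abs(c1 - 2 / sqrt(pi)) < 1e-12 and e1 == 0.5

# ... and a second half-derivative recovers f'(x) = 1.
c2, e2 = frac_deriv_monomial(c1, e1, 0.5)
assert abs(c2 - 1.0) < 1e-12 and e2 == 0.0
```

The composition property $D^{1/2} D^{1/2} = D^1$ on monomials is exactly the "integrity" mentioned above: the gamma factors from the two half-steps multiply out to the ordinary derivative's coefficient.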
Discrete and fractional calculus are not encountered in standard coursework (which is why I chose to write about them), but you will likely see complex differentiation in a complex analysis course. By the end of it all, you will find that the single-variable definition is boring and new systems are more fun.