I'm looking for equivalent definitions of differentiability in general. I'm really confused about the differentiation of multivariable functions. I would also like proofs of the equivalence of these definitions.
Here is the definition in Stewart's Calculus, but it doesn't make sense to me:

There are plenty of definitions of differentiability. For example, in the context of $\mathbb{R}^n$, we can take the definition to be:
A function $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$ is differentiable at $x \in \mathbb{R}^n$ provided there exists a linear mapping $A(x): \mathbb{R}^n \rightarrow \mathbb{R}^m$ such that $$\lim_{h \rightarrow 0} \frac{\| f(x + h) - (f(x) + A(x)\,h) \|}{\|h\|} = 0.$$
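This limit can be checked numerically for a concrete function. The sketch below (my own example, not from Stewart) uses the hypothetical choice $f(x, y) = x^2 + 3y$ at the point $(1, 2)$, where the candidate linear map is the row vector of partial derivatives $(2 \cdot 1, \ 3)$:

```python
import math

# Sketch: numerically check the limit definition of differentiability
# for the sample function f(x, y) = x^2 + 3y at the point (1, 2).
# The candidate linear map A(x) is the row vector of partials [2*1, 3].

def f(x, y):
    return x * x + 3 * y

x0, y0 = 1.0, 2.0
A = (2 * x0, 3.0)  # candidate derivative [f_x, f_y] at (1, 2)

def error_ratio(hx, hy):
    """||f(x0 + h) - (f(x0) + A h)|| / ||h||"""
    numer = abs(f(x0 + hx, y0 + hy) - f(x0, y0) - (A[0] * hx + A[1] * hy))
    return numer / math.hypot(hx, hy)

# The ratio should shrink to 0 as h -> 0, whatever the direction.
for t in (1e-1, 1e-3, 1e-5):
    hx, hy = 0.6 * t, 0.8 * t  # a fixed unit direction, scaled toward 0
    print(error_ratio(hx, hy))
```

For this $f$ the numerator works out to $h_x^2$, so the ratio decays linearly in $\|h\|$, as the definition demands.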
Clearly, in the context you are looking for, $n = 2, m = 1$.
Proving the equivalence of the various definitions can get a little involved, simply because analysis adapts the notion to fit a variety of different contexts and applications. An immediate question that is often discussed after the definition above is how it relates to the "differential", which is the approach to the derivative that your excerpt takes. The differential of $f$ at $x$ is simply the linear map $$df_x(h) = A(x)\, h.$$ If we write $x = \pmatrix{a \\ b}$, $A(x) = \pmatrix{f_x(a,b) & f_y(a,b)}$ and $h = \pmatrix{\Delta x \\ \Delta y}$, then $$df_x(h) = f_x(a,b)\, \Delta x + f_y(a,b)\, \Delta y.$$ The increment of $f$ can then be written as $$\Delta z = f(a + \Delta x, b + \Delta y) - f(a,b) = f_x(a,b)\, \Delta x + f_y(a,b)\, \Delta y + \epsilon_1\, \Delta x + \epsilon_2\, \Delta y,$$ where $\epsilon_1, \epsilon_2 \rightarrow 0$ as $(\Delta x, \Delta y) \rightarrow (0,0)$.

This is exactly the statement you provided, and it is equivalent to the limit definition above: dividing the error term $\epsilon_1\, \Delta x + \epsilon_2\, \Delta y$ by $\|h\| = \sqrt{\Delta x^2 + \Delta y^2}$ gives a quantity bounded by $\sqrt{\epsilon_1^2 + \epsilon_2^2}$ (Cauchy–Schwarz), which goes to $0$. Stewart's version is rather specific to $n = 2$, $m = 1$, which is indicative of the source, Stewart's Calculus.
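The $\epsilon_1, \epsilon_2$ decomposition can also be exhibited concretely. A minimal sketch, again assuming the sample function $f(x, y) = x^2 + 3y$ at $(a, b) = (1, 2)$ (my choice, not Stewart's): here one valid choice is $\epsilon_1 = \Delta x$ and $\epsilon_2 = 0$, since the remainder $\Delta z - df$ equals $\Delta x^2$.

```python
# Sketch: for f(x, y) = x^2 + 3y at (a, b) = (1, 2), the increment
# Delta z decomposes exactly as in Stewart's definition, with the
# choice epsilon_1 = Delta x and epsilon_2 = 0, both -> 0 with h.

def f(x, y):
    return x * x + 3 * y

a, b = 1.0, 2.0
fx, fy = 2 * a, 3.0  # partial derivatives at (a, b)

for t in (1e-1, 1e-2, 1e-3):
    dx, dy = t, t
    dz = f(a + dx, b + dy) - f(a, b)
    eps1, eps2 = dx, 0.0  # remainder dz - df = dx**2 = eps1 * dx
    rhs = fx * dx + fy * dy + eps1 * dx + eps2 * dy
    print(dz, rhs)  # the two sides agree; eps1, eps2 vanish with (dx, dy)
```

Note the choice of $\epsilon_1, \epsilon_2$ is not unique; any split of the remainder across the two terms that vanishes in the limit works.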