Let $f:\mathbb{R}^n\rightarrow \mathbb{R}$ be a function such that $$\lim\limits_{\varepsilon\rightarrow 0^+}\dfrac{f(x+\varepsilon y)-f(x)}{\varepsilon}=b+a\cdot y$$ $\forall y\in\mathbb{R}^n$. Show that $b=0$.
I tried this problem, but all I have so far is that $$\lim\limits_{\varepsilon\rightarrow 0^+}\dfrac{f(x+\varepsilon (-y))-f(x)}{\varepsilon}=-\lim\limits_{\varepsilon\rightarrow 0^-}\dfrac{f(x+\varepsilon y)-f(x)}{\varepsilon}$$
This question is about why the Gateaux derivative has the form $f'(x;y)=a\cdot y$ and not $f'(x;y)=b+a\cdot y$.
Let $y=0$. Then the numerator vanishes for every $\varepsilon>0$, so
$$\lim\limits_{\varepsilon\to 0^+} \frac{f(x+\varepsilon y)-f(x)}{\varepsilon}=\lim\limits_{\varepsilon\to 0^+} \frac{f(x)-f(x)}{\varepsilon}=0.$$
On the other hand, the hypothesis says this limit equals $b+a\cdot y=b+a\cdot 0=b$. Hence $b=0$.
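The argument above can also be checked numerically. Here is a minimal sketch (not part of the original proof) using an arbitrarily chosen smooth $f(v)=\lVert v\rVert^2$ and an arbitrary point $x$: with $y=0$ the difference quotient is identically $0$ for every $\varepsilon>0$, so the limit is $0$, while for nonzero $y$ the quotient approaches $a\cdot y$ with gradient $a=2x$ and no constant offset $b$.

```python
import numpy as np

# Hypothetical example function: f(v) = |v|^2, whose gradient at x is 2x.
def f(v):
    return np.dot(v, v)

x = np.array([1.0, -2.0, 0.5])   # arbitrary base point
a = 2 * x                        # gradient of f at x

def quotient(y, eps):
    """One-sided difference quotient (f(x + eps*y) - f(x)) / eps."""
    return (f(x + eps * y) - f(x)) / eps

# With y = 0 the numerator is f(x) - f(x) = 0 for every eps > 0,
# so the quotient (and hence its limit) is exactly 0; matching this
# against b + a.0 = b forces b = 0.
print(quotient(np.zeros(3), 1e-6))   # exactly 0.0

# For nonzero y the quotient tends to a.y -- again no constant term.
y = np.array([0.3, 1.0, -1.0])
for eps in (1e-2, 1e-4, 1e-6):
    print(eps, quotient(y, eps), np.dot(a, y))
```

For this quadratic $f$ the quotient is $2x\cdot y + \varepsilon\lVert y\rVert^2$ exactly, so the printed values converge linearly in $\varepsilon$ to $a\cdot y$.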