Proving/disproving a sufficient condition for Gâteaux differentiability


Let $X,Y$ be Banach spaces. A function $f :X \to Y$ is said to be Gâteaux differentiable at $x$ if there exists a bounded linear operator $A : X \to Y$ such that $$\lim_{r \to 0}\frac{\|f(x+rh)-f(x)-rAh\|}{|r|}=0\tag{1}$$ for every $h \in X.$
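As a quick numerical illustration of the limit in $(1)$ (my own sketch, not part of the question), take $X=Y=\mathbb R$, $f(x)=x^2$, and the candidate derivative $Ah=2xh$; the quotient is $|r|h^2$, which vanishes as $r \to 0$:

```python
# Numerical sketch of the Gateaux quotient (1) for f(x) = x**2 on R,
# with candidate derivative A: h -> 2*x*h at the point x.
# Illustrative example chosen by me, not taken from the post.

def gateaux_quotient(f, A, x, h, r):
    """||f(x + r*h) - f(x) - r*A(h)|| / |r| for scalar-valued f."""
    return abs(f(x + r * h) - f(x) - r * A(h)) / abs(r)

f = lambda x: x ** 2
x, h = 3.0, 1.5
A = lambda h: 2 * x * h  # candidate Gateaux derivative at x

for r in (1e-1, 1e-3, 1e-5):
    # Exact value of the quotient here is |r| * h**2, so it shrinks with r.
    print(r, gateaux_quotient(f, A, x, h, r))
```

Here the decay of the quotient with $r$ is linear, since $f(x+rh)-f(x)-2xrh = r^2h^2$ exactly.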

I'm trying to prove or disprove that it is sufficient to take $r \in \mathbb R$ in $(1).$

I tried the following example:

Let $f : \mathbb C \to \mathbb C$ be defined as $f(z)=\overline{z}.$ Define $A: \mathbb C \to \mathbb C$ as $A(h)=\overline{h}.$ Then $A$ is a bounded linear operator.

Suppose $r \in \mathbb{R}.$ Then, since $\overline{rh}=r\overline{h}$ for real $r,$ $$\frac{\|f(z+rh)-f(z)-rAh\|}{|r|}=\frac{|\overline{z+rh}-\overline{z}-r\overline{h}|}{|r|}=0.$$

But if $r \in \mathbb C,$ then $$\frac{\|f(z+rh)-f(z)-rAh\|}{|r|}=\frac{|\overline{rh}-r\overline{h}|}{|r|}=\frac{|\overline{r}-r|\,|\overline{h}|}{|r|}=\frac{2\,|\Im r|\,|h|}{|r|}\not\to 0$$ as $r \to 0,$ where $\Im r$ is the imaginary part of $r.$ (For instance, along $r=it$ with $t \in \mathbb R$ the quotient is constantly $2|h|.$)
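The computation above can be checked numerically (my own check; the point $z$, direction $h$, and the purely imaginary path $r = it$ are arbitrary choices): along $r = it$ the quotient stays at $2|h|$ instead of tending to $0$.

```python
# Check that for f(z) = conj(z) and A(h) = conj(h), the quotient
# ||f(z + r*h) - f(z) - r*A(h)|| / |r| does NOT vanish along r = i*t.
# Numerical verification of the post's computation, with arbitrary z, h.

def quotient(z, h, r):
    f = lambda w: w.conjugate()  # f(z) = conj(z)
    A = lambda w: w.conjugate()  # candidate "derivative" A(h) = conj(h)
    return abs(f(z + r * h) - f(z) - r * A(h)) / abs(r)

z, h = 1 + 2j, 3 - 1j
for t in (1e-1, 1e-3, 1e-5):
    r = 1j * t
    # Exactly 2*|Im r|*|h| / |r| = 2*|h| along this path: no decay.
    print(t, quotient(z, h, r))
```

The printed values sit at $2|h| = 2\sqrt{10}$ for every $t$, matching the closed-form expression $2|\Im r||h|/|r|$.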

Therefore, it is not sufficient to take $r \in \mathbb R$ in $(1).$

Edit: As pointed out by Daniel Fischer in the comments, my example above was wrong. I would still like to know whether it is sufficient to take $r \in \mathbb R$ in $(1).$ I have tried several other examples, but I couldn't construct one that successfully disproves the claim.
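Presumably the flaw in the example (the post does not spell it out, so this is my reading) is that $A(h)=\overline{h}$ is only $\mathbb R$-linear: it is additive and real-homogeneous, but $A(ih)=-iA(h)\ne iA(h)$, so it is not a bounded *linear* operator on $\mathbb C$ viewed as a complex Banach space and is not an admissible $A$ in $(1)$. A minimal check:

```python
# Conjugation is additive and R-homogeneous but not C-linear:
# A(i*h) = conj(i*h) = -i*conj(h) = -i*A(h) != i*A(h).
# My own check of why A(h) = conj(h) is not admissible in (1);
# the post itself does not state the reason the example fails.

A = lambda h: h.conjugate()

h = 2 + 1j
print(A(1j * h), 1j * A(h))   # the two values differ

assert A(h + (1 - 3j)) == A(h) + A(1 - 3j)  # additivity holds
assert A(2 * h) == 2 * A(h)                 # real homogeneity holds
assert A(1j * h) != 1j * A(h)               # complex homogeneity fails
```

So the real-$r$ limit in the example only certifies real-Gâteaux differentiability, which is weaker than differentiability with respect to the complex scalar field.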