Suppose that $f:\mathbb{R}^2\to\mathbb{R}$ satisfies $f(0,0)=0$ and $f(t)=o(\|t\|^2)$ as $t\to (0,0)$.
If the first and second partial derivatives of $f$ all exist, is it true that $$\frac{\partial f}{\partial x}(t) = o(\|t\|) \quad\text{and}\quad \frac{\partial f}{\partial y}(t) = o(\|t\|)\,,\quad t\to (0,0)\,?$$
My attempt
Put $\phi(s)=f(su)$ where $u=(1,0)$. Since the first and second partial derivatives of $f$ exist, $\phi'$ exists everywhere and $\phi''(0)=\frac{\partial^2 f}{\partial x^2}(0,0)$ exists, so Taylor's theorem with the Peano remainder gives $$\phi(s)=\phi(0)+\phi'(0) s + \frac12 \phi''(0)s^2 +o(s^2)\,,\quad s\to 0.$$ Since $f(t)=o(\|t\|^2)$ as $t\to (0,0)$, we have $\phi(s)=o(s^2)$, and hence $\phi(0)=\phi'(0)=\phi''(0)=0$.
Since $\phi''(0)$ exists and equals $0$, the definition of the derivative gives $\phi'(s)=\phi'(0)+\phi''(0)s+o(s)=o(s)$, i.e. $$\frac{\partial f}{\partial x}(su)=o(s)\,,\quad s\to 0.$$
But I get confused whether we can say $$\frac{\partial f}{\partial x}(t) = o (\|t\|)\, , \, t\to (0,0)$$
Any hints? Thanks in advance!
Added
The reason I asked this question is that I was trying to find a possible method for solving the following problem.
Suppose that $f:\mathbb{R}^n\to\mathbb{R}$ with $f(0)=0$ satisfies $$f(x)=o(\|x\|^m)\,,\quad x\to 0.$$ If all the $m$-th order partial derivatives exist, prove that these partial derivatives equal $0$ at the origin (or give a counterexample).
Actually, based on the answer that was posted, it is clear that this method does not work.
I would also appreciate some hints for the newly posted question. Thanks in advance!
Let us consider a simpler problem: let $f : \mathbb R \to \mathbb R$ be differentiable everywhere, with $f(t)= o(t^2)$ as $t \to 0$. Is then $f'(t) = o(|t|)$ as $t \to 0$?
In words, the answer is the following: even though $f(t) = o(t^2)$, this places no useful restriction on the difference quotient $\frac{f(x+h) - f(x)}{h}$ for $|x|,|h|$ small. Let us examine this. Start with a $c > 0$, for which we are to find $\ell$ such that $|t| < \ell$ implies $|f'(t)| \le c|t|$. From the hypothesis we know that there is $\ell'$ such that $|x|,|x+h| < \ell'$ implies $$ \frac{|f(x+h) - f(x)|}{|h|} \leq \frac{c|x+h|^2 + c|x|^2}{|h|} = c\,\frac{|x+h|^2 + |x|^2}{|h|}\,, $$
and the problem with the latter quantity is that you cannot let $x \to 0$ or $h \to 0$ without the right-hand side potentially becoming unbounded.
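To see the blow-up concretely, here is a minimal numerical sketch (illustrative only; the values of $c$ and $x$ are arbitrary choices): fixing a small $x \neq 0$ and shrinking $h$ makes the bound $c(|x+h|^2+|x|^2)/|h|$ arbitrarily large.

```python
# Sketch: the bound c*(|x+h|^2 + |x|^2)/|h| from the inequality above
# gives no control as h -> 0 with x fixed.
c, x = 1.0, 0.01  # arbitrary illustrative values
for h in (1e-2, 1e-4, 1e-6):
    bound = c * (abs(x + h) ** 2 + abs(x) ** 2) / abs(h)
    print(f"h={h:g}  bound={bound:g}")  # grows without bound as h shrinks
```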
For example, let $g(t) = t^3 \sin(t^{-1})$ with $g(0) = 0$. Then $|g(t)| \leq |t|^3$ and $t^3 \in o(t^2)$ near $0$, so $g$ satisfies the given conditions. Furthermore, $g$ is differentiable everywhere (including at zero, as one can check easily), and $$g'(t) = t\left[3t \sin t^{-1} - \cos t^{-1}\right]\,,\quad t \neq 0\,,$$ which is not $o(|t|)$ near zero: for $t= \frac 1{n\pi}$ with $n \to \infty$ we have $|g'(t)| = |t|$, so for any $c < 1$ in the definition of $o(\cdot)$ we get a contradiction along this sequence.
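As a quick numerical sanity check of this example (a sketch, not part of the argument), one can confirm that $|g(t)|/t^2 \to 0$ while $|g'(t_n)|/t_n$ stays near $1$ along $t_n = \frac{1}{n\pi}$:

```python
import math

def g(t):
    # g(t) = t^3 sin(1/t), extended by g(0) = 0
    return 0.0 if t == 0 else t ** 3 * math.sin(1.0 / t)

def g_prime(t):
    # For t != 0: g'(t) = 3 t^2 sin(1/t) - t cos(1/t); also g'(0) = 0
    return 0.0 if t == 0 else 3 * t ** 2 * math.sin(1.0 / t) - t * math.cos(1.0 / t)

# g(t) = o(t^2): the ratio |g(t)|/t^2 <= |t| shrinks with t
for t in (1e-1, 1e-2, 1e-3):
    print(f"t={t:g}  |g(t)|/t^2 = {abs(g(t)) / t ** 2:.2e}")

# but g'(t) is not o(|t|): along t_n = 1/(n*pi), the ratio |g'(t_n)|/t_n stays ~1
for n in (10, 100, 1000):
    t = 1.0 / (n * math.pi)
    print(f"n={n}  |g'(t_n)|/t_n = {abs(g_prime(t)) / t:.6f}")
```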
Now, for the two variable case something similar holds. Indeed, let $f(x,y) = g(x) + g(y)$. Then the partial derivatives exist everywhere, and $f(x,y) \in o(\|(x,y)\|^2)$: this can be verified easily, since $|f(x,y)| \leq |x|^3 + |y|^3 \leq 2\|(x,y)\|^3$. However, the partial derivative with respect to $x$ is $\frac{\partial f}{\partial x}(x,y) = g'(x)$, and along the points $(x,y) = \left(\frac{1}{n\pi}, 0\right)$ we have $\left|\frac{\partial f}{\partial x}(x,y)\right| = \frac{1}{n\pi} = \|(x,y)\|$, so the partial derivative is not $o(\|(x,y)\|)$. (Note that the product $g(x)g(y)$ would not do here: $|g(y)g'(x)| \leq |y|^3(|x| + 3x^2)$, which is in fact $o(\|(x,y)\|^2)$.) I am leaving the lighter details to you.
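The same check extends to two variables with the additive combination $f(x,y) = g(x) + g(y)$, whose $x$-partial is just $g'(x)$, so the one-dimensional failure transfers directly; again only a numerical sketch:

```python
import math

def g(t):
    # g(t) = t^3 sin(1/t), extended by g(0) = 0
    return 0.0 if t == 0 else t ** 3 * math.sin(1.0 / t)

def g_prime(t):
    # For t != 0: g'(t) = 3 t^2 sin(1/t) - t cos(1/t); also g'(0) = 0
    return 0.0 if t == 0 else 3 * t ** 2 * math.sin(1.0 / t) - t * math.cos(1.0 / t)

def f(x, y):
    # additive two-variable example: f(x, y) = g(x) + g(y)
    return g(x) + g(y)

# f(x, y) = o(||(x, y)||^2): the ratio shrinks along the diagonal
for s in (1e-1, 1e-2, 1e-3):
    print(f"s={s:g}  |f(s,s)|/||(s,s)||^2 = {abs(f(s, s)) / math.hypot(s, s) ** 2:.2e}")

# df/dx(x, 0) = g'(x) is not o(||(x, 0)||): along (1/(n*pi), 0) the ratio stays ~1
for n in (10, 100, 1000):
    x = 1.0 / (n * math.pi)
    print(f"n={n}  |df/dx(x,0)|/||(x,0)|| = {abs(g_prime(x)) / x:.6f}")
```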