In our lecture we had the following solved problem, which I didn't fully understand.
Find the partial derivatives of the function $f:\mathbb R^n\rightarrow \mathbb R$, $$f(x)=||x||^{\alpha},$$ outside of $(0,0)$, where $\alpha \in \mathbb R$. For which constant values of $\alpha$ do the partial derivatives also exist at $(0,0)$?
By the chain rule, the partial derivatives of $f$ are $$\partial _if(x)=\alpha||x||^{\alpha-1}\times D||x||.$$
Since $$D||x||=D\left(\sqrt{x^2_1+...+x^2_n}\right)=\frac{1}{2}\times 2x_i(x^2_1+...+x^2_n)^{-\frac{1}{2}}=x_i(x^2_1+...+x^2_n)^{-\frac{1}{2}}=||x||^{-1}$$ we can write $$\partial _if(x)=\alpha||x||^{\alpha-1}\times D||x||=\alpha||x||^{\alpha-1}\times ||x||^{-1}=\alpha||x||^{\alpha-2}$$ when $x\neq 0$ and $i\le n$.
When $\alpha -2\ge 0$, i.e. $\alpha \ge 2$ (if we set $0^0=1$; $\alpha\gt 2$ if $0^0$ is left undefined), the partial derivative also exists at $(0,0)$.
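To fix notation for my questions below (this is my own restatement, not part of the lecture notes): as I understand it, existence of, say, $\partial_1 f$ at the origin means by definition that the one-variable limit $$\partial_1 f(0)=\lim_{t\to 0}\frac{f(t,0,\dots,0)-f(0)}{t}=\lim_{t\to 0}\frac{|t|^{\alpha}-f(0)}{t}$$ exists, so the claim in the last step should be a statement about this limit.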
My questions:
How did we get $$x_i(x^2_1+...+x^2_n)^{-\frac{1}{2}}=||x||^{-1}$$ since isn't $$x_i(x^2_1+...+x^2_n)^{-\frac{1}{2}}=x_i\times \frac{1}{\sqrt{x^2_1+...+x^2_n}}=x_i \times ||x||^{-1}?$$
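To check which version is right, I also tried a quick numerical sketch (Python/NumPy; the test point $x=(2,1,2)$, the index $i=1$, and the step size $h$ are arbitrary choices of mine):

```python
import numpy as np

# Quick numerical check of the two candidate formulas for the partial
# derivative of f(x) = ||x||^alpha (test point and step size are arbitrary).
alpha = 3.0
x = np.array([2.0, 1.0, 2.0])   # ||x|| = 3
i = 0                           # check the partial derivative in x_1
h = 1e-6

f = lambda v: np.linalg.norm(v) ** alpha
e = np.zeros_like(x)
e[i] = 1.0

# central finite-difference approximation of the i-th partial derivative
numeric = (f(x + h * e) - f(x - h * e)) / (2 * h)

r = np.linalg.norm(x)
without_xi = alpha * r ** (alpha - 2)        # formula as written in the notes
with_xi = alpha * x[i] * r ** (alpha - 2)    # formula with the x_i factor kept

print(numeric, without_xi, with_xi)
```

The finite difference matches the version that keeps the $x_i$ factor, which is what makes me suspect the notes dropped it.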
When we wrote out $$\partial _if(x)=\alpha||x||^{\alpha-2},$$ why did we have to add the condition "when $i\le n$"? Isn't $i\le n$ the case anyway?
As for the last part of the solution (the claim that the partial derivatives exist at $(0,0)$ when $\alpha\ge 2$ with the convention $0^0=1$, or when $\alpha\gt 2$ without it): how did we figure out that the partial derivative doesn't exist when $\alpha\lt 2$? Doesn't $f$ approach $0$ when $\alpha\lt 2$? And how come the derivative suddenly exists once we adopt certain conventions?
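For this last question, here is how I tried to probe the limit myself (my own sketch; along the $x_1$-axis the difference quotient is $(|t|^{\alpha}-0)/t$, where I take $f(0)=0$, which holds for $\alpha\gt 0$, and the sample values of $\alpha$ and $t$ are my own choices):

```python
# Difference quotient for the first partial derivative of f(x) = ||x||^alpha
# at the origin, along the x_1-axis:
#   (f(t,0,...,0) - f(0)) / t = |t|**alpha / t   (taking f(0) = 0, valid for alpha > 0)

def quotient(alpha, t):
    return abs(t) ** alpha / t

for alpha in (3.0, 1.5, 0.5):
    print(alpha, [quotient(alpha, 10.0 ** (-k)) for k in (2, 4, 6)])
```

For $\alpha=0.5$ the quotient blows up, but for $\alpha=1.5$ it seems to go to $0$ as well, which is exactly why the condition $\alpha\ge 2$ confuses me.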