I am trying to prove that $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$, $$f(x)=x\|x\|$$ is differentiable as a part of a larger task.
I think there are two ways to approach this:
- By proving that for every $x\in\mathbb{R}^2$ there is a linear function $A$ for which $$f(x+h)=f(x)+Ah+\|h\|\epsilon(h)$$ where $\epsilon(h)\rightarrow0$ when $h\rightarrow0$.
- By proving that all of the first-order partial derivatives of $f$ exist and are continuous.
For the sake of my own understanding, I would like to know how to prove this both ways. Here are my attempts so far:
- $f(x+h)=(x_{1}\sqrt{x_{1}^2+x_{2}^2}+h_{1},x_{2}\sqrt{x_{1}^2+x_{2}^2}+h_{2})=(x_{1}\sqrt{x_{1}^2+x_{2}^2},x_{2}\sqrt{x_{1}^2+x_{2}^2})+(h_{1},h_{2})=f(x)+(h_{1},h_{2})$
I don't know how to go on with this since I'm not sure how I'm supposed to choose $A$. If $f$ is differentiable, $A$ should be $Df(x)$, but I don't know how I should manipulate the expression to achieve that.
- For the second approach, I'm not sure how to write out the general form of the first-order partial derivatives of $f$.
Part 1. Here is a derivation (no pun intended) of $f'(x)(h)$ from first principles. It is valid not just in $\mathbb{R}^2$, but in any real inner product space $E$, not necessarily even finite-dimensional. ($E$ is not even assumed to be complete; but if it isn't, then I don't think one is allowed to speak of $f$ being "differentiable" at $x$.)
First, we make some estimates:
(i) By the Triangle Inequality, $\lvert\lVert x + h \rVert - \lVert x \rVert\rvert \leqslant \lVert h \rVert$.
(ii) If $x \ne 0$, then by (i), as $h \to 0$, $$ \left\lvert\frac{2}{\lVert x + h \rVert + \lVert x \rVert} - \frac{1}{\lVert x \rVert}\right\rvert = \frac{\lvert\lVert x \rVert - \lVert x + h \rVert\rvert} {(\lVert x + h \rVert + \lVert x \rVert)\lVert x \rVert} \leqslant \frac{\lVert h \rVert}{\lVert x \rVert^2} = O(\lVert h \rVert). $$
(iii) The Cauchy-Schwarz inequality $\lvert\left\langle x, h \right\rangle\rvert \leqslant \lVert x \rVert \lVert h \rVert$ gives $\left\langle x, h \right\rangle = O(\lVert h \rVert)$.
Now, for all $x, h \in E$, \begin{align*} f(x + h) - f(x) & = \lVert x + h \rVert(x + h) - \lVert x \rVert x \\ & = \lVert x \rVert h + (\lVert x + h \rVert - \lVert x \rVert)(x + h) \\ & = \lVert x \rVert h + (\lVert x + h \rVert - \lVert x \rVert)x + O(\lVert h \rVert^2), && \text{by (i).} \end{align*} This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now on, we assume that $x \ne 0$. \begin{gather*} f(x + h) - f(x) - \lVert x \rVert h = \frac{\lVert x + h \rVert^2 - \lVert x \rVert^2} {\lVert x + h \rVert + \lVert x \rVert}x + O(\lVert h \rVert^2) \\ = \frac{2\left\langle x, h \right\rangle + \lVert h \rVert^2} {\lVert x + h \rVert + \lVert x \rVert}x + O(\lVert h \rVert^2) = \frac{2}{\lVert x + h \rVert + \lVert x \rVert} \left\langle x, h \right\rangle x + O(\lVert h \rVert^2) \\ = \frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x + \left(\frac{2}{\lVert x + h \rVert + \lVert x \rVert} - \frac{1}{\lVert x \rVert}\right) \left\langle x, h \right\rangle x + O(\lVert h \rVert^2). \end{gather*} Therefore, by (ii) and (iii), $$ f(x + h) = f(x) + \lVert x \rVert h + \frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x + O(\lVert h \rVert^2). $$
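As a quick numerical sanity check (not part of the proof), the derived formula $f'(x)(h) = \lVert x \rVert h + \frac{\langle x, h\rangle}{\lVert x\rVert}x$ can be tested directly: the remainder $f(x+h)-f(x)-f'(x)(h)$ should vanish faster than $\lVert h \rVert$. A minimal sketch, with function names of my own choosing:

```python
import numpy as np

def f(x):
    # f(x) = ||x|| x
    return np.linalg.norm(x) * x

def df(x, h):
    # Candidate derivative from the derivation above (valid for x != 0):
    # f'(x)(h) = ||x|| h + (<x, h> / ||x||) x
    r = np.linalg.norm(x)
    return r * h + ((x @ h) / r) * x

x = np.array([3.0, -1.0])
h0 = np.array([0.7, 0.4])

# The remainder f(x+h) - f(x) - f'(x)(h) is O(||h||^2), so the
# ratio below should shrink roughly linearly with t.
for t in [1e-1, 1e-3, 1e-5]:
    h = t * h0
    ratio = np.linalg.norm(f(x + h) - f(x) - df(x, h)) / np.linalg.norm(h)
    print(t, ratio)
```

The roughly linear decay of the printed ratio is consistent with the $O(\lVert h \rVert^2)$ remainder obtained above.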
This agrees with the formula for $f'(x)(h)$ in my earlier brief comment. (The main result used there, apart from the Chain Rule and the formula for the derivative of the square root function on $\mathbb{R}_{>0}$, is the formula for the Fréchet derivative of a bilinear map.)
Part 2. For simplicity [but at some risk of confusion with the earlier use of the symbol '$x$'!], I'll use the notation $(x, y)$, instead of $(x_1, x_2)$, and write $$ (u, v) = f(x, y) = r(x, y) = (rx, ry), \text{ where } r = \sqrt{x^2 + y^2}. $$
The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that answer has now been deleted, I'll go over the same ground here.
We have $f(h, 0) = (h|h|, 0)$, $f(0, k) = (0, k|k|)$, and so \begin{align*} |u(h, 0)| & = h^2, \ v(h, 0) = 0, \\ |v(0, k)| & = k^2, \ u(0, k) = 0, \end{align*} showing that the partial derivatives $D_1u(0, 0)$, $D_1v(0, 0), D_2v(0, 0)$, $D_2u(0, 0)$ exist, and are all zero.
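These difference quotients can also be evaluated numerically. A short sketch (the helper name is mine), using that the quotient for $D_1u(0,0)$ is $u(h,0)/h = h\lvert h\rvert/h = \lvert h\rvert$, which tends to $0$:

```python
import numpy as np

def f(x):
    # f(x) = ||x|| x
    return np.linalg.norm(x) * x

# Difference quotients at the origin along each coordinate axis;
# all four entries should tend to 0 as t -> 0.
for t in [1e-1, 1e-3, 1e-6]:
    q1 = f(np.array([t, 0.0])) / t   # quotients for D_1 u, D_1 v
    q2 = f(np.array([0.0, t])) / t   # quotients for D_2 u, D_2 v
    print(t, q1, q2)
```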
Assume now that $(x, y) \ne (0, 0)$. Then $r > 0$, and $\partial r/\partial x = x/r$, $\partial r/\partial y = y/r$, whence $$ \begin{pmatrix} \frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\ \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y} \end{pmatrix} \begin{pmatrix} h \\ k \end{pmatrix} = \begin{pmatrix} \frac{x^2}{r} + r & \frac{xy}{r} \\ \frac{xy}{r} & \frac{y^2}{r} + r \end{pmatrix} \begin{pmatrix} h \\ k \end{pmatrix} = r \begin{pmatrix} h \\ k \end{pmatrix} + \frac{xh + yk}{r} \begin{pmatrix} x \\ y \end{pmatrix}, $$ in agreement with the previous result.
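The Jacobian above can be written as $rI + \frac{1}{r}\,(x, y)^{\mathsf T}(x, y)$, and a finite-difference comparison offers a quick check (a sketch only; the function names are assumptions of mine, not part of the answer):

```python
import numpy as np

def f(x):
    # f(x) = ||x|| x
    return np.linalg.norm(x) * x

def jacobian(x):
    # Closed-form Jacobian at (x, y) != (0, 0): r I + (1/r) [x y]^T [x y]
    r = np.linalg.norm(x)
    return r * np.eye(2) + np.outer(x, x) / r

def jacobian_fd(x, eps=1e-7):
    # Central finite differences, one column per coordinate direction.
    J = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

p = np.array([1.2, -0.5])
print(jacobian(p))
print(jacobian_fd(p))
```

The two matrices agree to within the accuracy of the central-difference scheme.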
In a convenient but admittedly loose notation, simply denoting the separate convergence of all four matrix entries, $$ \lim_{(x, y) \to (0, 0)} \begin{pmatrix} \frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\ \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y} \end{pmatrix} = \lim_{(x, y) \to (0, 0)} \begin{pmatrix} \frac{x^2}{r} + r & \frac{xy}{r} \\ \frac{xy}{r} & \frac{y^2}{r} + r \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}, $$ showing that all four partial derivatives are continuous everywhere. $\square$