Example of 1-differentiable function with symmetric Hessian for which the second order Taylor expansion fails

A function $f:\mathbb{R}^N\to\mathbb{R}$ is twice-differentiable at a point $x_0$ if it is differentiable in a neighborhood $B_\delta(x_0)$ of $x_0$ and its first differential $\mathrm{d}f:B_\delta(x_0)\to L(\mathbb{R}^N; \mathbb{R})$ is differentiable at $x_0$. From my understanding, this is equivalent to saying that $f$ is twice-differentiable at $x_0$ if and only if it is differentiable in a neighborhood of $x_0$ and there exists a quadratic form $q:\mathbb{R}^N\to \mathbb{R}$ such that

\begin{equation}\tag{1}\label{a} f(x_0 +h) - f(x_0) - \mathrm{d}f(x_0)(h) - q(h) = o(\|h\|^2), \end{equation}

that is, you have a Taylor expansion of order two. A posteriori, you discover that the quadratic form acts as

$$ q(h) = \frac{1}{2}\, h\, Hf(x_0)\, h^{T}, $$

where $Hf(x_0)$ is the Hessian matrix of $f$ at $x_0$ and $h$ is treated as a row vector.
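
For contrast, here is a small numerical sketch (my own addition; the smooth function $f(x,y)=e^x\cos y$ and the base point $(0,0)$ are arbitrary choices) showing that for a genuinely twice-differentiable function the error in \eqref{a} decays faster than $\|h\|^2$:

```python
import math

# Smooth test function f(x, y) = exp(x) * cos(y) at x0 = (0, 0):
# gradient (1, 0), Hessian [[1, 0], [0, -1]] (symmetric).
def f(x, y):
    return math.exp(x) * math.cos(y)

def taylor_error(hx, hy):
    # LHS of (1): f(x0 + h) - f(x0) - df(x0)(h) - q(h)
    linear = 1.0 * hx + 0.0 * hy
    q = 0.5 * (1.0 * hx * hx - 1.0 * hy * hy)
    return f(hx, hy) - f(0.0, 0.0) - linear - q

# The ratio |error| / t^2 (proportional to ||h||^2 for h = (t, t))
# tends to 0 as t -> 0, i.e. the error is o(||h||^2).
for t in [1e-1, 1e-2, 1e-3]:
    print(t, abs(taylor_error(t, t)) / (t * t))
```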

Pretty much any math textbook has examples of functions whose Hessian is not symmetric, and which therefore cannot be twice-differentiable. What about the other direction: functions with symmetric Hessian that do not admit a second-order Taylor expansion, that is, for which the LHS of \eqref{a} fails to be $o(\|h\|^2)$ (it is merely $O(\|h\|^2)$) for every quadratic form $q$?

I tried assuming that $\partial_x f(x,y)$ is one of the usual functions that admits a gradient but is not differentiable, and then integrating it with respect to $x$ to recover $f(x,y)$, up to some function $g(y)$. I carried out the computations with some of those classic examples, but didn't get where I wanted.

Anyone with an explicit example?

Best answer

Short answer: $f= r^2 \sin^3 (4 \theta)$ works.

Explanation. Begin by constructing a smooth function $g$ on the unit circle satisfying $g(\theta) = g'(\theta) = 0$ at each $\theta$ that is a multiple of $\pi/2$; here $g(\theta) = \sin^3(4\theta)$. Then consider the function $f(x,y) = (x^2+y^2)\, g(\theta)$, which is homogeneous of degree two.
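
As a sanity check on the construction (a small numerical sketch I added, with $g(\theta) = \sin^3(4\theta)$ as in the short answer), both $g$ and $g'$ vanish at the multiples of $\pi/2$:

```python
import math

def g(theta):
    return math.sin(4.0 * theta) ** 3

def g_prime(theta):
    # d/dtheta sin^3(4*theta) = 12 * sin^2(4*theta) * cos(4*theta)
    return 12.0 * math.sin(4.0 * theta) ** 2 * math.cos(4.0 * theta)

# At theta = k*pi/2 both g and g' are zero (up to float rounding).
for k in range(4):
    theta = k * math.pi / 2.0
    print(k, g(theta), g_prime(theta))
```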

  1. The second derivatives exist at the origin and the mixed derivatives agree.

Note that $f$ vanishes identically on each coordinate axis (since $g$ does), so it is flat there.

Then one checks easily that at the origin $f_{xx}=0 =f_{yy}$.

What about the mixed derivatives? In general, $f_x = 2x\, g + (x^2+y^2)\, g'\, \theta_x$, where $\theta_x = \frac{-y}{x^2+y^2}$. At points on the $y$ axis, $\theta_x = \frac{-1}{y}$, so $f_x(0,y) = -y\, g'(\theta)$. To compute $(f_x)_y$ at the origin along the $y$ axis, one examines the difference quotient $\frac{f_x(0,y) - f_x(0,0)}{y} = \frac{-y\, g'(\theta)}{y} = -g'(\theta)$, noting that $f_x(0,0) = 0$ since $f$ vanishes on the $x$ axis. On the $y$ axis $\theta = \pm\pi/2$, where $g'$ vanishes, so the limit exists and is zero.

Similarly, one checks that the mixed derivative $(f_y)_x$, taken in the other order, exists and vanishes at the origin.
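
This two-step limit can be sketched numerically (my addition; the sample points and step sizes are arbitrary, chosen with $\varepsilon \ll y$ so the inner difference quotient converges first):

```python
import math

def f(x, y):
    r2 = x * x + y * y
    if r2 == 0.0:
        return 0.0
    return r2 * math.sin(4.0 * math.atan2(y, x)) ** 3

eps = 1e-8   # inner step for the first derivative
y = 1e-3     # point on the y axis
# f_x(0, y) by a central difference, then the outer quotient f_x(0, y)/y,
# which tends to (f_x)_y(0, 0) = 0 as y -> 0.
fx = (f(eps, y) - f(-eps, y)) / (2.0 * eps)
print(fx / y)

# Same check in the other order: f_y(x, 0)/x -> (f_y)_x(0, 0) = 0.
x = 1e-3
fy = (f(x, eps) - f(x, -eps)) / (2.0 * eps)
print(fy / x)
```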

  2. The function $f$ is not well-approximated by any quadratic. Note that a quadratic form $Q(x,y) = a x^2 + bxy + cy^2$ can change sign at most $4$ times as $\theta$ varies over the circle. (In fact, there are no sign changes unless the discriminant is negative, in which case there are exactly four changes of sign.)

The function $f$ cannot be approximated well by a quadratic $Q(x,y)$ if we choose $g$ to oscillate and change sign more than $4$ times (here $\sin^3(4\theta)$ changes sign $8$ times). Why? If $Q$ is not identically zero, its zero set is at most two lines through the origin, so there is a ray where $g$ is zero but $Q$ is not. Along that ray the error $|f - Q| = r^2\, |Q(\cos\theta, \sin\theta)|$ does not vanish to order $o(r^2)$. And if $Q$ is identically zero, the fact that $g$ is nonzero somewhere likewise precludes the error from being $o(r^2)$.
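
A numerical check of the failure (my addition; the ray $\theta = \pi/8$ is chosen because $\sin(4\theta) = 1$ there) with the quadratic $Q \equiv 0$ built from the vanishing Hessian at the origin:

```python
import math

def f(x, y):
    r2 = x * x + y * y
    if r2 == 0.0:
        return 0.0
    return r2 * math.sin(4.0 * math.atan2(y, x)) ** 3

# The Hessian of f at the origin is zero, so the candidate quadratic in (1)
# is Q = 0.  Along the ray theta = pi/8, where sin(4*theta) = 1, the ratio
# |f - Q| / r^2 equals 1 for every r > 0: the error is O(r^2) but not o(r^2).
c, s = math.cos(math.pi / 8), math.sin(math.pi / 8)
for r in [1e-1, 1e-3, 1e-6]:
    print(r, abs(f(r * c, r * s)) / (r * r))
```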