Interchanging mixed derivatives: is $D_{h_2} (D_{h_1} f)(a) = D_{h_1} (D_{h_2} f)(a)$ if $D_{h_2} (D_{h_1} f)$ is continuous at $a$?


Given $f : E \subset \mathbb{R}^n \to \mathbb{R}$ and $h_1,h_2 \in \mathbb{R}^n$ such that $D_{h_1} f$, $D_{h_2} f$ and $D_{h_2} (D_{h_1} f)$ exist in a neighbourhood of $a$ and are continuous at $a$, prove that $D_{h_1}(D_{h_2} f)(a)$ exists and equals $D_{h_2} (D_{h_1} f)(a)$.

We have by definition that $$ \begin{align} D_{h_1} (D_{h_2} f)(a) &= \lim \limits_{s \to 0} \frac{D_{h_2}f(a+s h_1) -D_{h_2} f(a)}{s}\\ &= \lim \limits_{s \to 0} \lim \limits_{t \to 0} \frac{(f(a+s h_1+t h_2)-f(a+s h_1))-(f(a+t h_2)-f(a))}{s t}, \end{align} $$ but I am stuck here. I think Lagrange's Mean Value Theorem is useful here, but I don't know how, or whether I am allowed to interchange the limits.
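As a numerical sanity check (not part of any proof), note that the double difference quotient in the last line is symmetric under swapping $(s, h_1) \leftrightarrow (t, h_2)$, which is the heart of the matter. A minimal Python sketch, using an assumed smooth test function $f(x,y) = \sin(x)e^y$ and ad hoc step sizes:

```python
import math

def f(x, y):
    # assumed smooth test function; any C^2 function would do
    return math.sin(x) * math.exp(y)

def mixed_quotient(f, a, h1, h2, s, t):
    """The double difference quotient from the question:
    [f(a+s h1+t h2) - f(a+s h1) - f(a+t h2) + f(a)] / (s t)."""
    p = lambda u, v: f(a[0] + u*h1[0] + v*h2[0], a[1] + u*h1[1] + v*h2[1])
    return (p(s, t) - p(s, 0) - p(0, t) + p(0, 0)) / (s * t)

a, e1, e2 = (0.3, 0.7), (1.0, 0.0), (0.0, 1.0)
exact = math.cos(0.3) * math.exp(0.7)   # second mixed partial of sin(x)e^y
q = mixed_quotient(f, a, e1, e2, 1e-5, 1e-5)
# swapping the roles of (s, h1) and (t, h2) evaluates f at the same four
# points, so the quotient is unchanged before any limit is taken
q_swapped = mixed_quotient(f, a, e2, e1, 1e-5, 1e-5)
print(q, q_swapped, exact)
```

The subtlety of the problem is precisely that this symmetry holds before the limits, while the two iterated limits need not agree without a continuity hypothesis.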

On BEST ANSWER

Here is a proof of the result in the case when $n = 2$, taken from Ghorpade and Limaye's A Course in Multivariable Calculus and Analysis.

We will need the so-called Rectangular Mean Value Theorem (stated on page 93 of the textbook):

Proposition 3.11 (Rectangular Mean Value Theorem). Let $a,b,c,d \in \mathbb{R}$ with $a < b$ and $c < d$, and let $f : [a,b] \times [c,d] \to \mathbb{R}$ satisfy the following.

  • For each fixed $y_0 \in [c,d]$, the function given by $x \mapsto f(x,y_0)$ is continuous on $[a,b]$ and differentiable on $(a,b)$.
  • For each fixed $x_0 \in [a,b]$, the function given by $y \mapsto f(x_0,y)$ is continuous on $[c,d]$ and differentiable on $(c,d)$.
  • For each fixed $x_0 \in (a,b)$, the function given by $y \mapsto f_x(x_0,y)$ is continuous on $[c,d]$ and differentiable on $(c,d)$; that is, $f_{xy}$ exists on $(a,b) \times (c,d)$.

Then there is $(x_0,y_0) \in (a,b) \times (c,d)$ such that $$ f(b,d) + f(a,c) - f(b,c) - f(a,d) = (b-a)(d-c)f_{xy}(x_0,y_0). $$
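To illustrate the conclusion: the "rectangular increment" on the left, divided by the area $(b-a)(d-c)$, is a value attained by $f_{xy}$ somewhere inside the rectangle, so it must lie between the extrema of $f_{xy}$ there. A quick Python check with an assumed test function $f(x,y)=\sin(x)e^y$, whose mixed partial is $\cos(x)e^y$:

```python
import math

a, b, c, d = 0.1, 0.6, 0.2, 0.9   # an arbitrarily chosen rectangle

f   = lambda x, y: math.sin(x) * math.exp(y)
fxy = lambda x, y: math.cos(x) * math.exp(y)   # mixed partial of f

# the "rectangular increment" from the theorem
incr = f(b, d) + f(a, c) - f(b, c) - f(a, d)
mean = incr / ((b - a) * (d - c))   # equals fxy at some interior point

# on this rectangle, fxy is decreasing in x and increasing in y, so its
# extrema sit at the opposite corners (b, c) and (a, d)
lo, hi = fxy(b, c), fxy(a, d)
print(lo, mean, hi)
```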

In the statement and proof of the Mixed Partials Theorem below (stated on page 94), $f_x \equiv D_1 f$, $f_y \equiv D_2 f$, $f_{xy} \equiv D_2 D_1 f$ and $f_{yx} \equiv D_1 D_2 f$.

Proposition 3.14 (Mixed Partials Theorem). Let $D \subseteq \mathbb{R}^2$ be an open set and let $(x_0,y_0)$ be any point of $D$. Let $f : D \to \mathbb{R}$ be such that both $f_x$ and $f_y$ exist on $D$. If $f_{xy}$ or $f_{yx}$ exists on $D$ and is continuous at $(x_0,y_0)$, then both $f_{xy}(x_0,y_0)$ and $f_{yx}(x_0,y_0)$ exist and $$ f_{xy}(x_0, y_0) = f_{yx}(x_0, y_0). $$

Proof. Assume that $f_{xy}$ exists on $D$ and is continuous at $(x_0,y_0)$. Let $\epsilon > 0$ be given. Since $D$ is an open subset of $\mathbb{R}^2$ and $f_{xy}$ is continuous at $(x_0,y_0)$, there is a $\delta > 0$ such that $B_\delta(x_0,y_0) \subseteq D$ and $$ (u,v) \in B_\delta(x_0,y_0) \implies |f_{xy}(u,v) - f_{xy}(x_0,y_0)| < \epsilon. $$

Fix $(h,k) \in B_\delta(0,0)$ with $h \neq 0$ and $k \neq 0$. By the Rectangular Mean Value Theorem, there is $(c,d) \in B_\delta(x_0,y_0)$ such that $$ f(x_0 +h, y_0 + k) - f(x_0 + h,y_0) - f(x_0,y_0 + k) + f(x_0,y_0) = hk f_{xy}(c,d). $$ The LHS of the above equation can be written as $G(y_0 + k) - G(y_0)$, where $G : (y_0 - \delta, y_0 + \delta) \to \mathbb{R}$ is defined by $G(y) = f(x_0 + h,y) - f(x_0,y)$. Consequently, $$ \left| \frac{G(y_0 + k) - G(y_0)}{hk} - f_{xy}(x_0,y_0)\right| = |f_{xy}(c,d) - f_{xy}(x_0,y_0)| < \epsilon. $$

Since $f_y$ exists on $D$, the function $G$ is differentiable at $y_0$ with $G'(y_0) = f_y(x_0 + h,y_0) - f_y(x_0,y_0)$. Hence, letting $k \to 0$ (with $h$ fixed), we see that for $0 < |h| < \delta$, $$ \left| \frac{f_y(x_0 + h,y_0)- f_y(x_0,y_0)}{h} - f_{xy}(x_0,y_0) \right| = \left| \frac{G'(y_0)}{h} - f_{xy}(x_0,y_0) \right| \leq \epsilon. $$ Since $\epsilon > 0$ was arbitrary, the difference quotient of $f_y$ in the $x$-direction converges at $(x_0,y_0)$; that is, $f_{yx}(x_0,y_0)$ exists and equals $f_{xy}(x_0,y_0)$. $\blacksquare$

The generalisation to directional derivatives on $\mathbb{R}^n$ follows by restricting $f$ to the plane through $a$ spanned by $h_1$ and $h_2$: apply the theorem above to $g(s,t) = f(a + s h_1 + t h_2)$, noting that $g_s(s,t) = D_{h_1} f(a + s h_1 + t h_2)$, $g_t(s,t) = D_{h_2} f(a + s h_1 + t h_2)$ and $g_{st}(s,t) = D_{h_2}(D_{h_1} f)(a + s h_1 + t h_2)$.


The result does not hold if the mixed derivatives are not assumed to be continuous at $a$.

For example, consider $f : \mathbb{R}^2 \to \mathbb{R}$ given by $$ f(x,y) = \begin{cases} xy\frac{x^2-y^2}{x^2+y^2}, & (x,y) \neq (0,0);\\ 0, & (x,y) = (0,0). \end{cases} $$ Let $D_1 = D_{e_1}$ and $D_2 = D_{e_2}$, where $\{ e_1,e_2 \}$ is the standard basis of $\mathbb{R}^2$. Then, $$ D_1 f(x,y) = \begin{cases} y\frac{x^4 + 4x^2y^2 - y^4}{(x^2 + y^2)^2}, & (x,y) \neq (0,0);\\ 0, & (x,y) = (0,0), \end{cases} $$ and $$ D_2 f(x,y) = \begin{cases} x\frac{x^4 - 4x^2y^2 - y^4}{(x^2 + y^2)^2}, & (x,y) \neq (0,0);\\ 0, & (x,y) = (0,0). \end{cases} $$ So, $D_1f(x,y)$ and $D_2f(x,y)$ exist on all of $\mathbb{R}^2$. We can also compute the mixed partial derivatives. $$ D_2 D_1 f(x,y) = \begin{cases} \frac{x^6 + 9x^4y^2 - 9x^2y^4 - y^6}{(x^2 + y^2)^3}, & (x,y) \neq (0,0);\\ -1, &(x,y) = (0,0), \end{cases} $$ and $$ D_1 D_2 f(x,y) = \begin{cases} \frac{x^6 + 9x^4y^2 - 9x^2y^4 - y^6}{(x^2 + y^2)^3}, & (x,y) \neq (0,0);\\ 1, &(x,y) = (0,0). \end{cases} $$ So, $D_2 D_1 f(x,y)$ and $D_1 D_2 f(x,y)$ exist on all of $\mathbb{R}^2$ as well. Yet, $D_2 D_1 f(0,0) \neq D_1 D_2 f(0,0)$.
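The two unequal values at the origin can be checked numerically. A short Python sketch using central difference quotients (the step sizes $h, k, t$ below are ad hoc choices):

```python
def f(x, y):
    # the classical counterexample above
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y * (x*x - y*y) / (x*x + y*y)

def d1(g, x, y, h=1e-8):
    # central difference quotient approximating D1 g(x, y)
    return (g(x + h, y) - g(x - h, y)) / (2 * h)

def d2(g, x, y, k=1e-8):
    # central difference quotient approximating D2 g(x, y)
    return (g(x, y + k) - g(x, y - k)) / (2 * k)

t = 1e-4
# D2 D1 f(0,0): difference quotient of D1 f along the y-axis
d2d1 = (d1(f, 0.0, t) - d1(f, 0.0, -t)) / (2 * t)
# D1 D2 f(0,0): difference quotient of D2 f along the x-axis
d1d2 = (d2(f, t, 0.0) - d2(f, -t, 0.0)) / (2 * t)
print(d2d1, d1d2)   # close to -1 and 1 respectively
```

This matches the closed forms above: $D_1 f(0,y) = -y$ and $D_2 f(x,0) = x$, whose derivatives at the origin are $-1$ and $1$.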