If the $f_n$ converge uniformly to $f$, then $\mathrm{d}f_n(x_n)$ converges to $\mathrm{d}f(x)$ for some $x_n \to x$


Let $f_n : \mathbb{R}^p \to \mathbb{R}$ be such that the $f_n$ are $C^1$ and such that the sequence $(f_n)_{n \in \mathbb{N}}$ converges uniformly to a function $f : \mathbb{R}^p \to \mathbb{R}$ which is $C^1$. Prove that for all $x \in \mathbb{R}^p$ there is a sequence $(x_n)_{n \in \mathbb{N}}$ which converges to $x$ such that $\mathrm{d}f_n(x_n)$ converges to $\mathrm{d}f(x)$.

I must say that I don't know at all how to proceed and don't have any intuition of what is really going on here. So we might first look at the case where $p = 1$.
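For intuition in the case $p = 1$, here is a numerical sketch with a hypothetical example of my own choosing (not from the question): $f_n(x) = \sin(nx)/n$ converges uniformly to $f = 0$, yet $f_n'(x) = \cos(nx)$ does not converge at a fixed $x$, so a constant sequence $x_n = x$ cannot work in general; nevertheless a suitable $x_n \to x$ does.

```python
import math

# Hypothetical example (not from the question): f_n(x) = sin(n x)/n converges
# uniformly to f = 0 since sup|f_n| = 1/n, but f_n'(x) = cos(n x) does not
# converge at a fixed x.  The claimed theorem still holds: take x_n to be the
# zero of cos(n t) nearest to x, i.e. t = (pi/2 + k*pi)/n for the best k.

def nearest_critical_point(n, x):
    # zeros of cos(n t) are t = (pi/2 + k*pi)/n; pick the integer k
    # minimising |t - x|, so that |x_n - x| <= pi/(2n)
    k = round((n * x - math.pi / 2) / math.pi)
    return (math.pi / 2 + k * math.pi) / n

x = 1.0
for n in (10, 100, 1000):
    xn = nearest_critical_point(n, x)
    # |x_n - x| shrinks like 1/n, and f_n'(x_n) = cos(n x_n) = 0 exactly
    print(n, abs(xn - x), abs(math.cos(n * xn)))
```

Here $f_n'(x_n) = 0 = f'(x)$ by construction; the theorem asserts that something like this is always possible, without knowing the critical points explicitly.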

So we can write:

$$f(a+h) = f(a)+ f'(a)h +o(h)$$ $$\forall n \in \mathbb{N}, f_n(a+h) = f_n(a) + f'_n(a)h +o(h)$$

Hence we have:

$$| f'(a)h - f'_n(a)h | \leq | f(a+h)-f_n(a+h) | + | f(a)-f_n(a) | + | o(h) |$$

Since the functions $f_n$ converge uniformly to $f$, the first two terms on the right tend to $0$ as $n \to \infty$; writing $f'_{\infty}(a)$ for a limit of the sequence $(f'_n(a))$ (assuming it exists), we get $$| f'(a)h - f'_{\infty}(a)h | \leq | o(h) |.$$ Dividing by $|h|$ and letting $h \to 0$ then gives:

$$| f'(a) - f_\infty'(a) | = 0.$$

I don't know if this works, but it feels strange to me since in that case the sequence $(x_n)$ is just the constant sequence... and moreover, if this is correct, I don't see at all how to generalise it to higher dimensions.

Thank you!



BEST ANSWER

We first prove the result in the special case of a strict local maximum. At a maximum the differential vanishes, so the claim reduces to the following lemma:

Lemma. If $(g_n)_n$ is a sequence of $C^1$ functions $\mathbb{R}^p\to \mathbb{R}$ converging uniformly to a $C^1$ function $g$ having a strict local maximum at $y$, that is $g(x)<g(y)$ for all $x\neq y$ in a ball neighborhood $B(y,r)$ of $y$, then there exists a sequence $(x_n)$ such that $\lim x_n=y$ and $dg_n(x_n)=0$ for sufficiently large $n$.

Proof of the Lemma: Pick $N$ sufficiently large so that $\forall n\geq N$, $$\sup_{ \|x-y\|=r} g_n(x)<g_n(y).$$ The existence of such an $N$ follows from the fact that the corresponding strict inequality holds for $g$ (by compactness of the sphere and the hypothesis), together with the uniform convergence of $(g_n)$. For any $n\geq N$, pick $x_n$ to be a maximum of $g_n$ on the closed ball $\overline{B}(y,r)$, which exists by compactness. Because of the previous inequality, $x_n$ lies in the interior of the ball $B(y,r)$, so the derivative satisfies $dg_n(x_n)=0$.

Let $x$ be the limit of a convergent subsequence of $(x_n)$; one exists since the closed ball is compact. Since $g_n(x_n)\geq g_n(y)$ by definition of $x_n$, taking the limit along the subsequence (using uniform convergence and continuity of $g$) we have $g(x)\geq g(y)$, and of course $x$ is still in the closed ball $\overline{B}(y,r)$. So necessarily $x=y$, since $y$ is a strict local maximum of $g$ on $B(y,r)$. As every limit point of the bounded sequence $(x_n)$ equals $y$, the whole sequence converges to $y$. This concludes the proof of the Lemma.
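The Lemma can be checked numerically on a concrete example of my own (hypothetical, not from the answer): $g(t) = -t^2$ has a strict maximum at $y = 0$, and $g_n(t) = -t^2 + \sin(nt)/n$ converges uniformly to $g$. A maximiser $x_n$ over the closed ball $[-r, r]$ satisfies $g_n(x_n) \geq g_n(0) = 0$, hence $|x_n| \leq 1/\sqrt{n}$.

```python
import math

# Hypothetical check of the Lemma: g(t) = -t^2 has a strict local max at y = 0,
# and g_n(t) = -t^2 + sin(n t)/n converges uniformly to g (sup gap = 1/n).
# Maximising g_n over the closed ball [-r, r] yields x_n -> 0: indeed
# g_n(x_n) >= g_n(0) = 0 forces x_n**2 <= sin(n * x_n)/n <= 1/n.

def g_n(n, t):
    return -t * t + math.sin(n * t) / n

def argmax_on_ball(n, r=1.0, steps=100_000):
    # grid search over the closed ball; 0 is a grid point since steps is even
    grid = (-r + 2 * r * i / steps for i in range(steps + 1))
    return max(grid, key=lambda t: g_n(n, t))

for n in (10, 100, 1000):
    xn = argmax_on_ball(n)
    # |x_n| <= 1/sqrt(n), so the maximisers converge to y = 0
    print(n, xn, abs(xn) <= 1 / math.sqrt(n))
```

The grid maximiser also sits in the interior of the ball for these $n$, matching the step of the proof where $dg_n(x_n) = 0$.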

Now, to deal with the general case, pick a point $y$ and define $$g_n(x)=f_n(x)-f(x)-\|x-y\|^2.$$ This sequence of $C^1$ functions converges uniformly to $g(x)=-\|x-y\|^2,$ which has a strict (global) maximum at $y$. The Lemma gives a sequence $(x_n)_n$ such that $\lim x_n=y$ and $dg_n(x_n)=0$ for sufficiently large $n$. Since $$dg_n(x_n).h=df_n(x_n).h-df(x_n).h-2\langle x_n-y, h \rangle,$$ we get $$df_n(x_n)=df(x_n)+ 2\langle x_n-y, \cdot \rangle.$$ The linear forms $h\mapsto 2\langle x_n-y, h \rangle$ converge to zero by the Cauchy–Schwarz inequality, and $df(x_n)$ converges to $df(y)$ since $f$ is $C^1$. This concludes the proof.
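A one-dimensional numerical sketch of this reduction, with a hypothetical $f_n$ of my choosing: take $f_n(x) = x^2 + \sin(nx)/n \to f(x) = x^2$ uniformly, fix $y$, and maximise $g_n(t) = f_n(t) - f(t) - (t-y)^2 = \sin(nt)/n - (t-y)^2$ near $y$. At the maximiser $x_n$, the identity $dg_n(x_n) = 0$ forces $f_n'(x_n) = f'(x_n) + 2(x_n - y) \to f'(y) = 2y$.

```python
import math

# Hypothetical 1-D illustration of the reduction (not from the answer):
# f_n(x) = x^2 + sin(n x)/n converges uniformly to f(x) = x^2.  For a chosen y,
# g_n(t) = f_n(t) - f(t) - (t - y)^2 = sin(n t)/n - (t - y)^2 is maximised at
# some x_n near y, where g_n'(x_n) = 0 gives f_n'(x_n) = f'(x_n) + 2 (x_n - y).

def g_n_prime(n, t, y):
    return math.cos(n * t) - 2 * (t - y)    # d/dt [sin(nt)/n - (t-y)^2]

def g_n_second(n, t):
    return -n * math.sin(n * t) - 2

def critical_point(n, y, r=0.5, steps=100_000):
    # coarse grid argmax of g_n on [y - r, y + r], then Newton refinement
    grid = [y - r + 2 * r * i / steps for i in range(steps + 1)]
    t = max(grid, key=lambda s: math.sin(n * s) / n - (s - y) ** 2)
    for _ in range(8):
        t -= g_n_prime(n, t, y) / g_n_second(n, t)
    return t

y = 0.7
for n in (10, 100, 1000):
    xn = critical_point(n, y)
    fn_prime = 2 * xn + math.cos(n * xn)    # f_n'(x_n)
    # both |x_n - y| and |f_n'(x_n) - f'(y)| tend to 0 as n grows
    print(n, abs(xn - y), abs(fn_prime - 2 * y))
```

The quadratic penalty $-(t-y)^2$ is exactly what converts "uniformly small" ($f_n - f$) into a strict maximum near $y$, which is the whole point of the construction.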

ANSWER

In one dimension we don't need uniform convergence; pointwise convergence will do. Also, we only need $f$ and the $f_n$, $n=1,2,\dots$, to be differentiable everywhere, not necessarily $C^1.$

WLOG we can assume $f\equiv 0$ because $f_n(x)-f(x)\to 0$ everywhere and $f_n-f$ is differentiable everywhere.

Fix $x\in \mathbb R.$ Let $\delta > 0.$ Then for $n\in \mathbb N$ the MVT shows there exists $c(n,\delta)\in (x,x+\delta)$ such that

$$f_n(x+\delta)- f_n(x) = f_n'(c(n,\delta))\delta.$$

Since the left side $\to 0$ as $n\to \infty,$ we can make the right side as small as we like by taking $n$ large. We can thus find $N = N_\delta$ such that $n\ge N_\delta$ implies $|f_n'(c(n,\delta))| < \delta.$

Now think of $\delta_k = 1/k, k=1,2,\dots .$ Then from the above there exist integers $0<N_1<N_2 < \cdots$ such that $N_k\le n < N_{k+1}$ implies $|f_n'(c(n,1/k))| < 1/k.$ If we then define

$$x_n = c(n,1/k),\qquad N_k\le n < N_{k+1},$$

we have $x_n\to x$ (since $x_n \in (x, x+1/k)$) and $f_n'(x_n)\to 0 = f'(x).$ (We can let $x_n$ be anything for $1\le n <N_1.$)
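The diagonal construction can be traced numerically with a hypothetical example of my own: $f_n(x) = \sin(nx)/n \to f \equiv 0$ pointwise. For $\delta = 1/k$ the difference quotient $(f_n(x+\delta)-f_n(x))/\delta$ is bounded by $2/(n\delta)$, so it is small for large $n$, and a grid search approximates the MVT point $c(n,\delta)$.

```python
import math

# Sketch of the diagonal MVT construction (hypothetical example, not from the
# answer): f_n(x) = sin(n x)/n converges to f = 0, while f_n'(x) = cos(n x)
# need not converge at a fixed x.  For delta = 1/k, the MVT point
# c(n, delta) in (x, x + delta) satisfies f_n'(c) = difference quotient,
# whose absolute value is at most 2/(n * delta).

def f_n(n, t):
    return math.sin(n * t) / n

def mvt_point(n, delta, x, steps=50_000):
    # approximate c in (x, x + delta) with f_n'(c) = difference quotient,
    # by grid search (the MVT guarantees an exact such c exists)
    slope = (f_n(n, x + delta) - f_n(n, x)) / delta
    grid = [x + delta * i / steps for i in range(1, steps)]
    return min(grid, key=lambda c: abs(math.cos(n * c) - slope))

x = 0.3
for k in (1, 2, 4):
    n = 100 * k * k              # any n >= N_{1/k}; here n*delta = 100k grows
    c = mvt_point(n, 1 / k, x)
    # |c - x| <= 1/k and |f_n'(c)| = |cos(n c)| <= 2/(100 k) + grid error
    print(k, abs(c - x), abs(math.cos(n * c)))
```

Taking $n$ large enough for each $k$ (the role of $N_k$ above) makes both $|x_n - x|$ and $|f_n'(x_n)|$ shrink along the diagonal, which is exactly the statement being proved.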