Simple proof that this metric vector space is complete


I have the vector space $C_c^\infty([a, b])=\{f \in C^\infty(\mathbb R) : \operatorname{supp}(f) \subset [a, b] \}$ and a metric on this space given by $$d(\varphi, \psi)=\sum_{n=0}^{\infty} 2^{-n} \frac{\lVert \varphi-\psi \rVert_n}{1 + \lVert \varphi-\psi \rVert_n}, \quad \text{where } \lVert \varphi \rVert_n=\sum_{j=0}^{n} \sup_{x \in [a, b]} \lvert \varphi^{(j)}(x) \rvert.$$ I have already shown that $d$ is a metric, but I am having trouble with completeness. Is there a simple proof or argument that helps? Or some literature containing the proof? I found the lemma "if $f_n \to f$ uniformly and $f_n' \to g$ uniformly, then $f$ is differentiable and $g=f'$", which is repeatedly used in such proofs, but I don't see how it helps me here.
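For intuition about how this metric behaves, here is a small numerical sketch (my own, not from any reference) of a truncated version of $d$, assuming the per-derivative sup norms $\sup_x|\varphi^{(j)}-\psi^{(j)}|$ are handed in as a list; `d_metric` is a hypothetical helper name:

```python
def d_metric(sups, terms=60):
    """Truncated d(phi, psi) = sum_{n>=0} 2^{-n} * s_n / (1 + s_n), where
    s_n = ||phi - psi||_n = sum_{j=0}^{n} sup|phi^(j)(x) - psi^(j)(x)|.
    `sups` lists those per-derivative sup norms; derivatives beyond
    len(sups) are assumed to agree (they contribute 0 to s_n)."""
    total, s_n = 0.0, 0.0
    for n in range(terms):
        if n < len(sups):
            s_n += sups[n]          # the norms ||.||_n are cumulative in n
        if s_n > 0:
            total += 2.0 ** (-n) * s_n / (1.0 + s_n)
    return total

# d stays below sum 2^{-n} = 2 no matter how large the seminorms get:
print(d_metric([1e9] * 10))   # close to 2
# and shrinking all sup norms drives d toward 0:
print(d_metric([1e-9] * 10))  # close to 0
```

The factor $s_n/(1+s_n) < 1$ is what makes the series converge even when the seminorms blow up; this is the standard way to combine countably many seminorms into one metric.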

On BEST ANSWER

Let $(f_k)$ be a Cauchy sequence for $d$. Then, for every $n\in\Bbb N$, it is also Cauchy for the distance $(\varphi,\psi)\mapsto\frac{\lVert \varphi-\psi \rVert_n}{1 + \lVert \varphi-\psi \rVert_n}$, since each term of the series is bounded by $d$. Because $t\mapsto\frac{t}{1+t}$ is increasing and vanishes only at $0$, the sequence is then also Cauchy for the norm $\|\cdot\|_n$ itself.
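To make that last step concrete: from $2^{-n}\frac{s}{1+s}\le d(f_k,f_j)$ with $s=\|f_k-f_j\|_n$, inverting $t\mapsto t/(1+t)$ gives an explicit bound on $s$. A minimal sketch (the helper name is mine):

```python
def seminorm_bound(d_val, n):
    """If d(f_k, f_j) <= d_val, then 2^{-n} * s/(1+s) <= d_val with
    s = ||f_k - f_j||_n.  Putting u = 2^n * d_val, and using that
    t -> t/(1+t) is increasing with inverse u -> u/(1-u), we get
    s <= u/(1-u) whenever u < 1."""
    u = (2 ** n) * d_val
    if u >= 1:
        return float("inf")  # this term of the series gives no information
    return u / (1 - u)

# as d(f_k, f_j) -> 0, every seminorm ||f_k - f_j||_n -> 0:
print(seminorm_bound(1e-6, 3))  # a small bound on ||f_k - f_j||_3
```

This is why Cauchy for $d$ forces Cauchy for each $\|\cdot\|_n$ separately, even though the weights $2^{-n}$ shrink.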

You already know that the pointwise limit $g$ exists, because for each $x$ the sequence $(f_k(x))\subset\mathbb{R}$ is Cauchy and $\Bbb R$ is complete. The same holds for every derivative, yielding pointwise limits $g_n$.

Now, these limits are in fact not only pointwise, but uniform. Let us prove it for the first one, $f_k\to g$. Since $(f_k)$ is Cauchy for the uniform norm $\|\cdot\|$ on $[a,b]$: $$\forall\epsilon>0,\ \exists N,\ \forall j,k\ge N,\ \|f_k-f_j\|\le\epsilon,$$ i.e. $$\forall j,k\ge N,\ \forall x\in[a,b],\ |f_k(x)-f_j(x)|\le\epsilon.$$ Letting $k\to\infty$ and using that $(f_k)$ converges pointwise to $g$: $$\forall j\ge N,\ \forall x\in[a,b],\ |g(x)-f_j(x)|\le\epsilon,$$ i.e. $\|g-f_j\|\le\epsilon$. This proves that $\forall\epsilon>0,\ \exists N,\ \forall j\ge N,\ \|g-f_j\|\le\epsilon$, that is, $f_j\to g$ uniformly.

Applying the lemma you found repeatedly, we now know that $(f_k)$ converges uniformly to a smooth function $g$ (whose support is then necessarily a subset of $[a,b]$) and that $f_k^{(n)}\to g^{(n)}$ uniformly for every $n\in\Bbb N$. It remains to show that $d(f_k,g)\to0$.

Let $\epsilon>0$. First choose some $n\in\Bbb N$ such that $\sum_{j>n}2^{-j}=2^{-n}<\epsilon/2$. Now write $d_j(\varphi,\psi)=\frac{\|\varphi-\psi\|_j}{1+\|\varphi-\psi\|_j}$. Since $d_j(f_k,g)$ tends to $0$ for every $j\in\{0,\dots,n\}$ (by the uniform convergence of the derivatives), so does $\sum_{j=0}^n2^{-j}d_j(f_k,g)$, which is therefore $<\epsilon/2$ for every large enough $k$. For such $k$, $d(f_k,g)<\epsilon/2+\epsilon/2=\epsilon$, q.e.d.
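The choice of the cutoff $n$ in this last step can be made completely explicit, since the geometric tail is $\sum_{j>n}2^{-j}=2^{-n}$. A tiny sketch (the helper name is hypothetical):

```python
def tail_cutoff(eps):
    """Smallest n with tail sum_{j>n} 2^{-j} = 2^{-n} < eps/2."""
    n = 0
    while 2.0 ** (-n) >= eps / 2.0:
        n += 1
    return n

print(tail_cutoff(0.5))  # smallest n with 2^{-n} < 0.25
```

This split-the-epsilon trick (bound the tail uniformly, then kill the finitely many remaining terms by convergence) is the standard argument for series-defined metrics of this kind.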