In a uniformly convex Banach space, $x_n\xrightarrow{w} x$ and $\lVert x_n\rVert\to \lVert x\rVert$ imply $\lVert x_n-x\rVert\to 0$


In a uniformly convex Banach space, show that $$x_n\xrightarrow{w} x \ \ \text{and} \ \ \lVert x_n\rVert\to \lVert x\rVert \ \ \text{imply} \ \ \lVert x_n-x\rVert\to 0.$$ Can you help me solve this?

Thanks in advance.


The case $x = 0$ is immediate, since $\lVert x_n\rVert\to 0$ is just the norm convergence of $(x_n)$ to $0$.

So let's suppose $x\neq 0$, and, without loss of generality, $\lVert x\rVert = 1$. By the Hahn-Banach theorem, there is a continuous linear functional $\lambda$ with $\lVert\lambda\rVert = 1$ and $\lambda(x) = 1$.

Since $x_n \xrightarrow{w} x$, we have $\lambda(x_n) \to \lambda(x) = 1$, and therefore

$$\lim_{n\to\infty} \lambda\left(\tfrac{1}{2}(x_n+x)\right) = 1,$$

which implies

$$\liminf_{n\to\infty} \left\lVert \tfrac{1}{2}(x_n+x)\right\rVert \geqslant 1.$$

On the other hand, since $\left\lVert\frac{1}{2}(x_n+x)\right\rVert \leqslant \frac{1}{2}(\lVert x_n\rVert + \lVert x\rVert)$, we have

$$\limsup_{n\to\infty} \left\lVert \tfrac{1}{2}(x_n+x)\right\rVert \leqslant 1,$$

so

$$\lim_{n\to\infty} \left\lVert\tfrac{1}{2}(x_n+x)\right\rVert = 1.$$

But a space is uniformly convex if and only if the following holds: for all sequences $(a_n)$ and $(b_n)$ with $\lim\limits_{n\to\infty} \lVert a_n\rVert = 1 = \lim\limits_{n\to\infty} \lVert b_n\rVert$ and $\lim\limits_{n\to\infty} \left\lVert \frac{1}{2}(a_n+b_n)\right\rVert = 1$, we have $\lim\limits_{n\to\infty} \lVert a_n - b_n\rVert = 0$.

Taking $a_n = x_n$ and $b_n = x$ then yields $\lVert x_n - x\rVert \to 0$, as desired.
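As an aside (not needed for the general argument): in the special case of a Hilbert space, the conclusion follows directly from the parallelogram law, with no appeal to the sequential characterisation:

```latex
% Hilbert-space special case. Using \lVert x_n\rVert \to 1 = \lVert x\rVert
% and \lVert \tfrac{1}{2}(x_n + x)\rVert \to 1 established above:
\[
\lVert x_n - x\rVert^2
  = 2\lVert x_n\rVert^2 + 2\lVert x\rVert^2 - \lVert x_n + x\rVert^2
  \longrightarrow 2 + 2 - 4 = 0.
\]
```

This is the classical Radon–Riesz argument in its simplest setting; uniform convexity is exactly what replaces the parallelogram law in a general Banach space.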

This characterisation of uniform convexity may be taken as the definition, or it may be a lemma that has to be proved. In the latter case, only one direction is needed here:

Suppose $\lVert a_n - b_n\rVert \not\to 0$. Then there are an $\varepsilon > 0$ and a subsequence along which $\lVert a_{n_k} - b_{n_k}\rVert > 2\varepsilon$; passing to this subsequence, we may assume the inequality holds for all $n$. For that $\varepsilon$, uniform convexity provides a $\delta > 0$ such that

$$\left\lVert \tfrac{1}{2}(u+v)\right\rVert \leqslant 1 - \delta$$

for all $u,v$ with $\lVert u\rVert = 1 = \lVert v\rVert$ and $\lVert u-v\rVert \geqslant \varepsilon$.
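For a concrete instance of such a $\delta$ (a standard computation, not needed for the proof): in a Hilbert space the parallelogram law gives it explicitly:

```latex
% In a Hilbert space, for \lVert u\rVert = \lVert v\rVert = 1 and
% \lVert u - v\rVert \ge \varepsilon, the parallelogram law yields:
\[
\left\lVert \tfrac{1}{2}(u+v)\right\rVert^2
  = \tfrac{1}{2}\lVert u\rVert^2 + \tfrac{1}{2}\lVert v\rVert^2
    - \tfrac{1}{4}\lVert u-v\rVert^2
  \leqslant 1 - \frac{\varepsilon^2}{4},
\]
% so one may take \delta = 1 - \sqrt{1 - \varepsilon^2/4}.
```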

Then, setting $\alpha_n = \frac{1}{\lVert a_n\rVert}a_n$ and $\beta_n = \frac{1}{\lVert b_n\rVert}b_n$ (since $\lVert a_n\rVert \to 1$ and $\lVert b_n\rVert \to 1$, we can have $a_n = 0$ or $b_n = 0$ only finitely often, and we drop those terms), and noting that $\lVert a_n - \alpha_n\rVert = \lvert 1 - \lVert a_n\rVert\rvert$ and $\lVert b_n - \beta_n\rVert = \lvert 1 - \lVert b_n\rVert\rvert$, we have

$$\begin{align} \left\lVert\alpha_n-\beta_n\right\rVert &\geqslant \lVert a_n-b_n\rVert - \lVert a_n-\alpha_n\rVert - \lVert b_n-\beta_n\rVert\\ &= \lVert a_n-b_n\rVert - \lvert 1 - \lVert a_n\rVert\rvert - \lvert 1-\lVert b_n\rVert\rvert\\ &> 2\varepsilon - \lvert 1 - \lVert a_n\rVert\rvert - \lvert 1-\lVert b_n\rVert\rvert\\ &> \varepsilon \end{align}$$

for $n$ so large that $\lvert 1-\lVert a_n\rVert\rvert < \varepsilon/2$ and $\lvert 1-\lVert b_n\rVert\rvert < \varepsilon/2$. But then

$$\begin{align} \left\lVert \tfrac{1}{2}(a_n+b_n)\right\rVert &\leqslant \left\lVert \tfrac{1}{2}(\alpha_n+\beta_n)\right\rVert + \tfrac{1}{2}(\lVert \alpha_n - a_n\rVert + \lVert \beta_n-b_n\rVert)\\ &\leqslant 1-\delta + \tfrac{1}{2}(\lVert \alpha_n - a_n\rVert + \lVert \beta_n-b_n\rVert)\\ &\leqslant 1 - \frac{\delta}{2} \end{align}$$

for $n$ so large that also $\lvert 1-\lVert a_n\rVert\rvert < \delta/2$ and $\lvert 1-\lVert b_n\rVert\rvert < \delta/2$, contradicting the assumption $\left\lVert \frac{1}{2}(a_n+b_n)\right\rVert \to 1$.


Summary: In a uniformly convex space, the curvature of spheres is "sufficiently large" in all directions that the intersection of a spherical shell $a-\varepsilon < \lVert y\rVert < a+\varepsilon$ with a "thickened supporting hyperplane" $a-\varepsilon < \operatorname{Re} \lambda(y) < a+\varepsilon$ becomes arbitrarily small as $\varepsilon \to 0$.
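A final remark on the hypotheses: the norm condition $\lVert x_n\rVert \to \lVert x\rVert$ cannot be dropped. The standard basis $(e_n)$ of $\ell^2$ converges weakly to $0$ (since $\langle e_n, y\rangle = y_n \to 0$ for every $y \in \ell^2$), yet $\lVert e_n - 0\rVert = 1$ for all $n$. The following sketch illustrates this numerically in a finite truncation of $\ell^2$ (all names and dimensions here are illustrative choices, not part of the proof):

```python
# Illustration only: in l^2, the standard basis (e_n) converges weakly
# to 0 but not in norm, so the hypothesis ||x_n|| -> ||x|| is essential.
import math

def e(n, dim):
    """n-th standard basis vector of R^dim (a truncated model of l^2)."""
    v = [0.0] * dim
    v[n] = 1.0
    return v

dim = 1000
y = [1.0 / (k + 1) for k in range(dim)]  # a fixed square-summable vector

# <e_n, y> = y[n] -> 0: pairings against the fixed y shrink,
# witnessing weak convergence of (e_n) to 0.
pairings = [sum(a * b for a, b in zip(e(n, dim), y)) for n in (0, 10, 100, 999)]

# ||e_n|| = 1 for every n: no norm convergence to 0.
norms = [math.sqrt(sum(c * c for c in e(n, dim))) for n in (0, 10, 100, 999)]

print(pairings)  # values 1/(n+1), tending to 0
print(norms)     # all equal to 1.0
```

Of course $\ell^2$ is uniformly convex; the point is that weak convergence alone, without convergence of the norms, does not force norm convergence.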