Existence and uniqueness of projections on closed convex sets


Let $X$ be a normed vector space. I'm interested in determining the minimal assumptions on $X$ that guarantee the existence and uniqueness of projections onto closed convex sets, and in counterexamples showing that those assumptions are indeed necessary.

In particular, let P1 and P2 be the following statements:

P1: (existence) for every closed convex set $C\subseteq X$ and every $x\in X$ there exists $y\in C$ with $\|x-y\|=d(x,C)$.

P2: (uniqueness) for every closed convex set $C\subseteq X$ and every $x\in X$ there exists a unique $y\in C$ with $\|x-y\|=d(x,C)$.

What I know so far is that P2 holds in all uniformly convex Banach spaces, while for P1 it is enough to assume that $X$ is a reflexive Banach space. However, I don't have an example of a reflexive Banach space not satisfying P2, and I don't know whether those assumptions can be weakened further.

BEST ANSWER

Answering my own question since I found a good reference.

Let $X$ be a normed space and $C\subseteq X$. Then $C$ is called

  • A set of uniqueness iff for every $x\in X$ there is at most one $c\in C$ with $d(x,C)=d(x,c)$
  • A set of existence iff for every $x\in X$ there is $c\in C$ with $d(x,C)=d(x,c)$
  • A Chebyshev set iff it is both a set of uniqueness and a set of existence

We have the following results:

Theorem: Let $X$ be a normed space. Then the following are equivalent:

  • $X$ is strictly convex
  • Every nonempty convex set in $X$ is a set of uniqueness
  • Every nonempty closed convex set in $X$ is a set of uniqueness

Theorem: Let $X$ be a normed space. Then the following are equivalent:

  • $X$ is reflexive
  • Every nonempty closed convex set in $X$ is a set of existence

Theorem (Day-James): Let $X$ be a normed space. Then the following are equivalent:

  • $X$ is reflexive and strictly convex
  • Every nonempty closed convex set in $X$ is a Chebyshev set

All of these results can be found in Megginson's "An Introduction to Banach Space Theory" as Theorem 5.1.17, Corollary 5.1.19, and the subsequent remarks.
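As a concrete feel for the Day-James theorem, here is a small numeric sketch of my own (the grid and tolerance are arbitrary choices): in $\mathbb{R}^2$ with the Euclidean norm, which is reflexive and strictly convex, the closed convex line $C=\{x=1\}$ has a unique nearest point to the origin.

```python
import math

# In R^2 with the Euclidean norm (reflexive and strictly convex), the line
# C = {x = 1} should have exactly one nearest point to (0, 0), namely (1, 0).

ys = [k / 100 for k in range(-200, 201)]   # sample points (1, y) on the line
dists = [math.hypot(1.0, y) for y in ys]   # Euclidean distance from the origin
d = min(dists)

minimizers = [y for y, dist in zip(ys, dists) if abs(dist - d) < 1e-12]
print(d)           # 1.0
print(minimizers)  # [0.0]: the nearest point is unique
```

Compare this with the max-norm counterexample in the answer below, where the same line has a whole segment of nearest points.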

ANSWER

Take $\mathbb{R}^2$ with the norm $\|(x,y)\| = \max \{|x| , |y| \}$; being finite-dimensional, it is clearly a reflexive space.

Now consider the line $x = 1$ in $\mathbb{R}^2$ as our convex set $C$, and the point $(0 , 0)$. Then every point of the form $(1 , y)$ with $-1 \leq y \leq 1$ on the line is a nearest point to $(0 , 0)$.

So being reflexive is not enough. One has to seek conditions explicitly on the norm of the space; topological behavior alone does not suffice. Indeed, the norm has to be strictly convex.
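The non-uniqueness in this example can be checked with a short numeric sketch (illustrative only; the grid spacing and tolerance are my own choices):

```python
# In the max norm on R^2, the nearest points of the line x = 1 to the origin
# form the whole segment {(1, y) : -1 <= y <= 1}.

def max_norm(v):
    return max(abs(v[0]), abs(v[1]))

ys = [k / 100 for k in range(-200, 201)]       # sample points (1, y) on the line
dists = [max_norm((1.0, y)) for y in ys]
d = min(dists)                                  # distance from (0,0) to the line

minimizers = [y for y, dist in zip(ys, dists) if abs(dist - d) < 1e-12]
print(d)                                        # 1.0
print(min(minimizers), max(minimizers))         # -1.0 1.0: a whole segment of minimizers
```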

ANSWER

Even though the OP did not ask for proofs of both statements, I cannot resist writing down two proofs and providing them here for future reference.

So, the statement P1 goes as follows:

Let $(V,\|\ \cdot\ \|)$ be a real Banach space and $K\subset V$ be a (norm) closed convex set. Assume that $V$ is reflexive. Show that for all $x\in V$ there exists $y\in K$ such that $$\|x-y\|=\inf_{z\in K}\|x-z\|=\text{dist}(x, K).$$

Proof:

We first prove that $\varphi:V\longrightarrow[0,\infty)$ defined by $\varphi(x):=\|x\|$ is a convex and continuous function. Let $\theta\in [0,1]$ and $x,y\in V$; then by the triangle inequality and homogeneity, we have $$\varphi(\theta x+(1-\theta)y)=\|\theta x+(1-\theta)y\|\leq \|\theta x\|+\|(1-\theta)y\|=\theta\|x\|+(1-\theta)\|y\|=\theta\varphi(x)+(1-\theta)\varphi(y).$$

Now, let $\epsilon>0$ and let $(x_{n})$ be a sequence in $V$ that converges to $x\in V$. Then, by definition, there exists $N$ large enough such that for all $n>N$ we have $\|x_{n}-x\|<\epsilon$. On the other hand, by the reverse triangle inequality, $$\epsilon>\|x_{n}-x\|\geq \big|\ \|x_{n}\|-\|x\|\ \big|=|\varphi(x_{n})-\varphi(x)|.$$ This means that when $x_{n}\rightarrow x$ we have $\varphi(x_{n})\rightarrow\varphi(x)$, as desired.

In particular, the norm is convex and (strong) lower semi-continuous.

Let $V$ be a reflexive real Banach space and $K\subset V$ a (norm) closed convex subset. Let $x\in V$, consider a sequence $(y_{n})_{n=1}^{\infty}$ such that $y_{n}\in K$ for all $n$ and $$\lim_{n\rightarrow\infty}\|x-y_{n}\|=\text{dist}(x,K)=\inf_{z\in K}\|x-z\|.$$

Note that such a sequence exists. Indeed, as the set $\{\|x-z\|:z\in K\}$ is bounded below by zero, the infimum $\inf_{z\in K}\|x-z\|$ exists; denote this infimum by $d$. By definition, for each $n\in\mathbb{N}$ there exists $y_{n}\in K$ such that $$d\leq \|x-y_{n}\|\leq d+\dfrac{1}{n}.$$

Taking $n\rightarrow\infty$, we see that $\lim_{n\rightarrow\infty}\|x-y_{n}\|=d.$ Such a (minimizing) sequence $(y_{n})_{n=1}^{\infty}$ is exactly what we need. In particular, this sequence is (norm) bounded because $\text{dist}(x,K)<\infty$.

As $V$ is reflexive and $(y_{n})_{n=1}^{\infty}$ is a (strongly) bounded sequence, it follows from [Theorem 3.18, Brezis] that there exists a subsequence $(y_{n_{k}})$ that converges in the weak topology $\sigma(V,V^{*})$.

Let $y$ be the weak limit of this subsequence. As $K$ is strongly closed (since it is norm closed) and convex, it follows from [Theorem 3.7, Brezis] that $K$ is weakly closed.

Hence, $y\in K$.

As we have proved that the norm is convex and (strongly) lower semi-continuous, it follows from [Corollary 3.9, Brezis] that the norm is weakly lower semi-continuous.

It follows that $$\|x-y\|\leq\liminf_{k\rightarrow\infty}\|x-y_{n_{k}}\|=\inf_{z\in K}\|x-z\|.$$ On the other hand, $\inf_{z\in K}\|x-z\|\leq \|x-u\|$ for all $u\in K$, so in particular $\inf_{z\in K}\|x-z\|\leq \|x-y\|.$

Hence, we have $\|x-y\|=\inf_{z\in K}\|x-z\|$, which concludes the proof.
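As an illustration of the statement just proved, here is a minimal numeric sketch of my own in the reflexive space $\mathbb{R}^2$ with the Euclidean norm, taking $K$ to be a closed half-space, for which the nearest point has the standard closed form $P_K(x)=x-\max(0,(a\cdot x-b)/\|a\|^2)\,a$ (the specific point and half-space below are arbitrary choices):

```python
import math

# Projection onto the closed convex half-space K = {z : a.z <= b} in R^2
# with the Euclidean norm, via the standard closed-form formula.

def project_halfspace(x, a, b):
    t = max(0.0, (a[0]*x[0] + a[1]*x[1] - b) / (a[0]**2 + a[1]**2))
    return (x[0] - t*a[0], x[1] - t*a[1])

def dist(u, v):
    return math.hypot(u[0]-v[0], u[1]-v[1])

x = (3.0, 4.0)
a, b = (1.0, 0.0), 1.0            # K = {(z1, z2) : z1 <= 1}
y = project_halfspace(x, a, b)
print(y)                          # (1.0, 4.0)
print(dist(x, y))                 # 2.0 = dist(x, K)

# A minimizing sequence in K converges to the same point: for instance
# y_n = (1 - 1/n, 4) lies in K and ||x - y_n|| = 2 + 1/n -> 2 = dist(x, K).
```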


Now, let us recall the definition of uniform convexity:

$V$ is said to be uniformly convex if for all $\epsilon\in(0,2)$, there exists $\delta_{\epsilon}>0$ such that for all $x,y\in V$, one has $$\|x\|\leq 1,\|y\|\leq 1\ \text{and}\ \|x-y\|\geq\epsilon\implies \Bigg\|\dfrac{x+y}{2}\Bigg\|\leq 1-\delta_{\epsilon}.$$
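To make the definition concrete, here is a small numeric sketch of my own: the Euclidean norm on $\mathbb{R}^2$ satisfies it with $\delta_{\epsilon}=1-\sqrt{1-\epsilon^{2}/4}$ (via the parallelogram law), while the max norm from the counterexample above fails it.

```python
import math

# Euclidean modulus of convexity: for unit vectors x, y with ||x - y|| = eps,
# the parallelogram law gives ||(x+y)/2||^2 = 1 - eps^2/4, so
# delta(eps) = 1 - sqrt(1 - eps^2/4) > 0 for eps in (0, 2).
eps = 1.0
x = (math.sqrt(1 - eps**2/4), eps/2)     # unit vector
y = (math.sqrt(1 - eps**2/4), -eps/2)    # unit vector with ||x - y|| = eps
mid = math.hypot((x[0]+y[0])/2, (x[1]+y[1])/2)
print(1 - mid)   # delta(1) = 1 - sqrt(3)/2

# The max norm fails: ||u|| = ||v|| = 1 and ||u - v|| = 2, yet the midpoint
# still has norm 1, so no delta > 0 can work.
u, v = (1.0, 1.0), (1.0, -1.0)
mid_inf = max(abs((u[0]+v[0])/2), abs((u[1]+v[1])/2))
print(mid_inf)   # 1.0
```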

Then, statement P2 is the following:

Assume that $V$ is uniformly convex, show that for all $x\in V$, there exists a unique $y\in K$ such that $$\|x-y\|=\inf_{z\in K}\|x-z\|=\text{dist}(x,K).$$

Proof:

To show the existence of such a unique element, the key is to show that a minimizing sequence is Cauchy.

Let $x\in V$ and denote $d:=\text{dist}(x,K)$. We may assume $d>0$: if $d=0$, then $x\in K$ (as $K$ is closed) and $y=x$ works. Consider a minimizing sequence $(y_{n})_{n=1}^{\infty}$, whose existence was proved in the proof of P1. For each $n$, we define $$x_{n}:=y_{n}-x, \ \ d_{n}:=\|y_{n}-x\|\ \ \text{and}\ \ z_{n}:=\dfrac{x_{n}}{d_{n}}.$$

It is clear that $z_{n}$ is a unit vector for each $n$ and $d_{n}\rightarrow d$ as $n\rightarrow\infty$. Also, we can write \begin{align*} \dfrac{z_{n}+z_{m}}{2}=\dfrac{x_{n}}{2d_{n}}+\dfrac{x_{m}}{2d_{m}}=\Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)(&c_{n}x_{n}+c_{m}x_{m}),\\ &\text{where}\ \ c_{n}:=\dfrac{1}{2d_{n}(\frac{1}{2d_{n}}+\frac{1}{2d_{m}})}\ \ \text{and}\ \ c_{m}:=\dfrac{1}{2d_{m}(\frac{1}{2d_{n}}+\frac{1}{2d_{m}})}. \end{align*} Note that $c_{n}+c_{m}=1$. Indeed, $$c_{n}+c_{m}=\dfrac{1}{1+\frac{d_{n}}{d_{m}}}+\dfrac{1}{1+\frac{d_{m}}{d_{n}}}=\dfrac{1+\frac{d_{m}}{d_{n}}+1+\frac{d_{n}}{d_{m}}}{(1+\frac{d_{n}}{d_{m}})(1+\frac{d_{m}}{d_{n}})}=\dfrac{1+\frac{d_{m}}{d_{n}}+1+\frac{d_{n}}{d_{m}}}{1+\frac{d_{m}}{d_{n}}+\frac{d_{n}}{d_{m}}+1}=1.$$

Therefore, by convexity of $K$, we know that $(c_{n}y_{n}+c_{m}y_{m})\in K$. Hence, it follows that \begin{align*} \Bigg\|\dfrac{z_{n}+z_{m}}{2}\Bigg\|=\Bigg\|\dfrac{x_{n}}{2d_{n}}+\dfrac{x_{m}}{2d_{m}}\Bigg\|&=\Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)\|c_{n}y_{n}+c_{m}y_{m}-c_{n}x-c_{m}x\|\\ &=\Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)\|c_{n}y_{n}+c_{m}y_{m}-(c_{n}+c_{m})x\|\\ &=\Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)\|c_{n}y_{n}+c_{m}y_{m}-x\|\\ &=\Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)\|x-(c_{n}y_{n}+c_{m}y_{m})\|\\ &\geq \Bigg(\dfrac{1}{2d_{n}}+\dfrac{1}{2d_{m}}\Bigg)\inf_{z\in K}\|x-z\|\\ &=\dfrac{d}{2d_{n}}+\dfrac{d}{2d_{m}}. \end{align*}

Now, let $\epsilon\in (0,2)$ and let $\delta_{\epsilon}$ be the corresponding $\delta$ from the definition of uniform convexity. Then we can find an $\epsilon_{1}>0$ small enough such that $\frac{d}{d+\epsilon_{1}}>1-\delta_{\epsilon}$, and an $N$ large enough such that $d\leq d_{n}<d+\epsilon_{1}$ for all $n\geq N$. (Such an $N$ exists by the definition of the infimum, and $d\leq d_{n}$ because $d$ is the infimum.)

We now show that $(z_{n})_{n=1}^{\infty}$ is Cauchy. To show this, we need to show that for all $n,m\geq N$, we have $\|z_{n}-z_{m}\|=\|\frac{x_{n}}{d_{n}}-\frac{x_{m}}{d_{m}}\|<\epsilon.$ Suppose not; then there exist $n,m\geq N$ such that $$\|z_{n}-z_{m}\|=\Bigg\|\dfrac{x_{n}}{d_{n}}-\dfrac{x_{m}}{d_{m}}\Bigg\|\geq\epsilon.$$ But then, $$\Bigg\|\dfrac{z_{n}+z_{m}}{2}\Bigg\|\geq \dfrac{1}{2}\Bigg(\dfrac{d}{d_{n}}+\dfrac{d}{d_{m}}\Bigg)>\dfrac{1}{2}\Bigg(\dfrac{d}{d+\epsilon_{1}}+\dfrac{d}{d+\epsilon_{1}}\Bigg)>\dfrac{1}{2}\cdot(1-\delta_{\epsilon}+1-\delta_{\epsilon})=1-\delta_{\epsilon}.$$

To summarize, we would have $\|z_{n}\|=1=\|z_{m}\|$ and $\|z_{n}-z_{m}\|\geq\epsilon$, yet $\|\frac{z_{n}+z_{m}}{2}\|> 1-\delta_{\epsilon}.$ This contradicts uniform convexity, which would force $\|\frac{z_{n}+z_{m}}{2}\|\leq 1-\delta_{\epsilon}$; thus $(z_{n})_{n=1}^{\infty}$ must be Cauchy.

(A side note: $N$ indeed depends on $\epsilon$, so the Cauchy argument makes sense. We chose $N$ depending on $\epsilon_{1}$, and $\epsilon_{1}$ depends on $\delta_{\epsilon}$, which in turn depends on $\epsilon$.)

As $(z_{n})$ is Cauchy and $d_{n}\rightarrow d$, we have $$\|y_{n}-y_{m}\|=\|x_{n}-x_{m}\|=\|d_{n}z_{n}-d_{m}z_{m}\|\leq d_{n}\|z_{n}-z_{m}\|+|d_{n}-d_{m}|\,\|z_{m}\|\longrightarrow 0\ \ \text{as}\ \ n,m\rightarrow\infty.$$

This shows that the minimizing sequence $(y_{n})$ is Cauchy. As $V$ is complete and $K$ is closed, $y_{n}$ converges to an element $y\in K$.

As the norm is continuous, we have $\|x-y_{n}\|\rightarrow \|x-y\|$; but as $(y_{n})$ is a minimizing sequence, by definition we have $\|x-y_{n}\|\rightarrow \text{dist}(x,K).$

Therefore, we have proved the existence of $y\in K$ such that $\|x-y\|=\text{dist}(x,K).$

To show the uniqueness of such $y$, suppose there exist $y,y'\in K$ with $y\neq y'$ such that $$\|x-y\|=\|x-y'\|=\text{dist}(x,K)=:d.$$ Then, as $y\neq y'$, we have $$\Bigg\|\dfrac{x-y}{d}-\dfrac{x-y'}{d}\Bigg\|=\dfrac{1}{d}\|y-y'\|\geq\epsilon\ \ \text{for some}\ \ \epsilon\in (0,2)$$ (such an $\epsilon$ exists because $0<\frac{1}{d}\|y-y'\|\leq 2$ by the triangle inequality).

As the space is uniformly convex and $\|\frac{x-y}{d}\|=1=\|\frac{x-y'}{d}\|$, there exists $\delta_{\epsilon}>0$ such that $$\dfrac{1}{2}\Bigg\|\dfrac{x-y}{d}+\dfrac{x-y'}{d}\Bigg\|=\dfrac{1}{d}\Bigg\|x-\dfrac{y+y'}{2}\Bigg\|\leq 1-\delta_{\epsilon}<1.$$ This implies that $$\Bigg\|x-\dfrac{y+y'}{2}\Bigg\|<d.$$

But $K$ is convex and $y,y'\in K$ and thus $\frac{y+y'}{2}=\frac{1}{2}y+(1-\frac{1}{2})y'\in K$ as well. Then the above strict inequality gives us a contradiction because $d$ is the infimum of $\|x-z\|$ where $z$ is over all elements of $K$.

Hence, $y$ must be unique. The proof is concluded.
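The averaging argument behind this uniqueness proof can be seen numerically; here is an illustrative sketch in the uniformly convex Euclidean plane (the specific points are my own choices): two distinct points equidistant from $x$ have a midpoint that is strictly closer.

```python
import math

# If y != y' were both nearest points of a convex K to x, their midpoint would
# lie in K (convexity) and, in a uniformly convex norm, be strictly closer to x.

def dist(u, v):
    return math.hypot(u[0]-v[0], u[1]-v[1])

x = (0.0, 0.0)
y, yp = (1.0, 1.0), (1.0, -1.0)            # both at Euclidean distance sqrt(2) from x
mid = ((y[0]+yp[0])/2, (y[1]+yp[1])/2)     # = (1, 0)

print(dist(x, y), dist(x, yp))  # sqrt(2) each
print(dist(x, mid))             # 1.0 < sqrt(2): the midpoint is strictly closer
```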