If there are $2$ linearly independent vectors $x,y \in X$ such that $||x+y||=||x||+||y||$, then the unit sphere $S(X)$ contains an interval


Let $X$ be a normed space and let $S(X)= \{x \in X: ||x||=1\}$ be its unit sphere. Assume that there are $x,y\in X$ linearly independent such that $||x+y||=||x||+||y||$. Prove that $S(X)$ contains a set of the form $[u,v]=\{z\in X: z=tu+(1-t)v,\ t\in [0,1]\}$ for some linearly independent $u,v\in S(X)$.

So it is clear that I need to use the $x,y$ that are given to be linearly independent to produce such an interval inside $S(X)$, but I don't know how to start.

There are 5 answers below.

BEST ANSWER

This is similar to user1551's answer, but a little simpler.

For the given $x, y$, and all $\lambda,\mu \geqslant 0$, we have as usual $$ \|\lambda x + \mu y\| \leqslant \lambda\|x\| + \mu\|y\|, $$ but also \begin{align*} (\lambda + \mu)(\|x\| + \|y\|) & = (\lambda + \mu)(\|x + y\|) \\ & = \|(\lambda + \mu)(x + y)\| \\ & = \|(\lambda x + \mu y) + \mu x + \lambda y\| \\ & \leqslant \|\lambda x + \mu y\| + \mu\|x\| + \lambda\|y\|, \end{align*} therefore $$ \|\lambda x + \mu y\| \geqslant \lambda\|x\| + \mu\|y\|, $$ therefore $$ \boxed{\|\lambda x + \mu y\| = \lambda\|x\| + \mu\|y\|} $$ Putting $\hat{x} = x/\|x\|$, $\hat{y} = y/\|y\|$, we have $\hat{x} \ne \hat{y}$, $\|\hat{x}\| = \|\hat{y}\| = 1$, and if $0 \leqslant t \leqslant 1$, \begin{align*} \|t\hat{x} + (1 - t)\hat{y}\| & = \left\lVert\frac{t}{\|x\|}x + \frac{1 - t}{\|y\|}y\right\rVert \\ & = \frac{t}{\|x\|}\|x\| + \frac{1 - t}{\|y\|}\|y\| \\ & = 1. \end{align*}
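The boxed identity can be sanity-checked numerically. A minimal sketch in $(\mathbb{R}^2, \ell^1)$, a space whose unit sphere does contain line segments; the vectors `x`, `y` below are an illustrative choice, not from the answer:

```python
# Numerical check of ||lam*x + mu*y|| = lam*||x|| + mu*||y|| and of the
# segment [x/||x||, y/||y||] lying on the unit sphere, in (R^2, l1-norm).
# x, y are an assumed example satisfying ||x+y|| = ||x|| + ||y||.

def l1(v):
    """l1 norm of a 2-vector."""
    return abs(v[0]) + abs(v[1])

x, y = (2.0, 0.0), (0.0, 3.0)                          # linearly independent
assert l1((x[0] + y[0], x[1] + y[1])) == l1(x) + l1(y)  # ||x+y|| = ||x|| + ||y||

# the boxed identity, for a grid of lambda, mu >= 0
for lam in (0.0, 0.3, 1.0, 2.5):
    for mu in (0.0, 0.7, 1.0, 4.0):
        lhs = l1((lam * x[0] + mu * y[0], lam * x[1] + mu * y[1]))
        rhs = lam * l1(x) + mu * l1(y)
        assert abs(lhs - rhs) < 1e-12

# the segment between the normalised vectors lies on the unit sphere
xh = (x[0] / l1(x), x[1] / l1(x))
yh = (y[0] / l1(y), y[1] / l1(y))
for k in range(11):
    t = k / 10
    z = (t * xh[0] + (1 - t) * yh[0], t * xh[1] + (1 - t) * yh[1])
    assert abs(l1(z) - 1.0) < 1e-12
print("all checks passed")
```

Here the segment is $\{(t, 1-t) : t \in [0,1]\}$, which is exactly a face of the $\ell^1$ unit ball.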

ANSWER

This argument assumes that the norm of $X$ is induced by an inner product. From $\|x+y\|=\|x\|+\|y\|$, $$\|x\|^2+2\|x\|\|y\|+\|y\|^2=\|x+y\|^2=\|x\|^2+\|y\|^2+2\langle x,y\rangle,$$ hence $\langle x,y\rangle =\|x\|\|y\|$. Then, for $a,b\ge 0$, $$\|ax+by\|^2=a^2\|x\|^2+b^2\|y\|^2+2ab\langle x,y\rangle =a^2\|x\|^2+b^2\|y\|^2+2ab\|x\|\|y\|=(a\|x\|+b\|y\|)^2,$$ i.e., $$ \|ax+by\|=a\|x\|+b\|y\|.$$ In particular, with $a=\frac t{\|x\|}$ and $b=\frac{1-t}{\|y\|}$ for $t\in[0,1]$, $$ \left\|t\frac{x}{\|x\|}+(1-t)\frac{y}{\|y\|}\right\|=1.$$
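A caveat worth keeping in mind: in a real inner-product space, equality $\langle x,y\rangle=\|x\|\|y\|$ is the equality case of Cauchy–Schwarz and forces $x,y$ to be parallel, so a numerical illustration of the identity above necessarily uses parallel vectors. A minimal sketch with an assumed pair (not from the answer):

```python
import math

def dot(u, v):
    """Euclidean inner product of 2-vectors."""
    return u[0] * v[0] + u[1] * v[1]

def norm(u):
    """Euclidean norm."""
    return math.sqrt(dot(u, u))

# Equality in Cauchy-Schwarz forces parallel vectors, so y = 2x here.
x, y = (1.0, 2.0), (2.0, 4.0)
assert abs(dot(x, y) - norm(x) * norm(y)) < 1e-12   # <x,y> = ||x|| ||y||

# ||a x + b y|| = a ||x|| + b ||y|| for a, b >= 0
for a in (0.0, 0.5, 1.0, 3.0):
    for b in (0.0, 0.25, 1.0, 2.0):
        z = (a * x[0] + b * y[0], a * x[1] + b * y[1])
        assert abs(norm(z) - (a * norm(x) + b * norm(y))) < 1e-9
print("identity verified")
```

This also explains why a Hilbert-space unit sphere contains no segment: the hypothesis of the problem can never hold for linearly independent vectors there.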

ANSWER

Let's say we have a closed, bounded, convex region $R$ in the plane. Suppose that

  • distinct points $O$, $A$, $B$, and $C$ lie in $R$,
  • $A$, $B$, and $C$ lie on the boundary and $O$ lies in the interior, and
  • $C$ lies on line segment $AB$.

Then the whole of line segment $AB$ lies on the boundary of $R$. For suppose to the contrary that some point $D$ on $AB$ does not: then ray $OD$ strikes the boundary of $R$ at some point $E$ beyond $D$, since $R$ is bounded, and all of segments $BE$ and $EA$ lie in $R$. Assume without loss of generality that $D$ lies between $B$ and $C$. Then ray $OC$ strikes segment $EA$ at some point $F$, and $C$ lies between $O$ and $F$. But that is impossible, since it would put $C$ in the interior of $R$.

To map back to the original problem: the plane in question is the span of $x$ and $y$, $O$ is the zero vector, $A$ is $x/\lVert x \rVert$, $B$ is $y/\lVert y \rVert$, $C$ is $(x+y)/\lVert x + y\rVert$, and $R$ is the restriction of the closed unit ball to the span of $x$ and $y$.

(See also: the norm as a Minkowski functional.)

ANSWER

Pick an arbitrary $p\in[\frac12,1]$ and let $q=1-p\in[0,\frac12]$. Then \begin{align} \|x\|+\|y\| = \|x+y\| &= \|(px+qy)+(qx+py)\|\\ &\le \|px+qy\|+\|qx+py\|\\ &\le(\|px\|+\|qy\|)+(\|py\|+\|qx\|)\tag{1}\\ &=(p\|x\|+q\|y\|)+(p\|y\|+q\|x\|)\\ &=\|x\|+\|y\| \end{align} and hence equalities must hold in $(1)$. Therefore $\|px+(1-p)y\|=\|px\|+\|(1-p)y\|=p\|x\|+(1-p)\|y\|$ for every $p\in[\frac12,1]$, and, swapping the roles of $x$ and $y$, for every $p\in[0,1]$.

Consequently, $\|ax+by\|=a\|x\|+b\|y\|$ for every $a,b\ge0$: for $a+b>0$, apply the previous identity with $p=\frac{a}{a+b}$ and multiply by $a+b$. As $x,y$ are linearly independent, they are nonzero, so we may normalise them to unit vectors $u=\frac{x}{\|x\|}$ and $v=\frac{y}{\|y\|}$. Absorbing $\|x\|,\|y\|$ into $a,b$ respectively, we obtain $\|au+bv\|=a+b$ for every $a,b\ge0$. In particular, taking $a=t\in[0,1]$ and $b=1-t$ gives $\|tu+(1-t)v\|=1$, i.e. $[u,v]\subset S(X)$.
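The final identity $\|au+bv\|=a+b$ can be checked numerically. A minimal sketch in $(\mathbb{R}^2, \ell^\infty)$ with an assumed pair of unit vectors satisfying the hypothesis (not from the answer):

```python
def linf(v):
    """Sup-norm of a 2-vector."""
    return max(abs(v[0]), abs(v[1]))

# Assumed example in (R^2, sup-norm): u, v are linearly independent
# unit vectors with ||u + v|| = 2 = ||u|| + ||v||.
u, v = (1.0, 0.5), (1.0, -0.5)
assert linf(u) == linf(v) == 1.0
assert linf((u[0] + v[0], u[1] + v[1])) == 2.0

# ||a u + b v|| = a + b for all a, b >= 0 ...
for a in (0.0, 0.4, 1.0, 2.0):
    for b in (0.0, 0.6, 1.0, 5.0):
        w = (a * u[0] + b * v[0], a * u[1] + b * v[1])
        assert abs(linf(w) - (a + b)) < 1e-12

# ... so in particular [u, v] lies on the unit sphere
for k in range(11):
    t = k / 10
    z = (t * u[0] + (1 - t) * v[0], t * u[1] + (1 - t) * v[1])
    assert abs(linf(z) - 1.0) < 1e-12
print("segment [u, v] lies on S(X)")
```

Here $au+bv=(a+b,\ (a-b)/2)$, and $a+b\ge|a-b|/2$ for $a,b\ge0$, which is why the sup-norm picks out the first coordinate.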

ANSWER

Define $$ \phi(\alpha)=\|x+\alpha y\|-\|x\|-\alpha\|y\|,\quad \alpha\ge 0. $$ This function is convex, satisfies $\phi(\alpha)\le 0$ by the triangle inequality, and $\phi(0)=\phi(1)=0$. Convexity then forces $\phi(\alpha)=0$ for all $\alpha\ge 0$: for $\alpha>1$, $0=\phi(1)\le\left(1-\frac1\alpha\right)\phi(0)+\frac1\alpha\phi(\alpha)=\frac1\alpha\phi(\alpha)$, so $\phi(\alpha)\ge 0$ and hence $\phi(\alpha)=0$; for $\alpha\in(0,1)$, $0=\phi(1)\le\frac1{2-\alpha}\phi(\alpha)+\frac{1-\alpha}{2-\alpha}\phi(2)=\frac1{2-\alpha}\phi(\alpha)$, so again $\phi(\alpha)=0$. Hence, $$ \|x+\alpha y\|=\|x\|+\alpha\|y\|,\quad \forall\alpha\ge 0. $$ Now define $$ \hat x=\frac{x}{\|x\|},\quad \hat y=\frac{y}{\|y\|},\quad t=\frac{\|x\|}{\|x\|+\alpha\|y\|}\in(0,1]. $$ Then $t\hat x+(1-t)\hat y=\frac{x+\alpha y}{\|x\|+\alpha\|y\|}$, so $$ \|t\hat x+(1-t)\hat y\|=\frac{\|x+\alpha y\|}{\|x\|+\alpha\|y\|}=1. $$ As $\alpha$ ranges over $[0,\infty)$, $t$ ranges over $(0,1]$; the remaining endpoint $t=0$ gives $\hat y\in S(X)$ trivially.
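The stated properties of $\phi$ can be spot-checked numerically. A minimal sketch in $(\mathbb{R}^2, \ell^1)$ with an assumed pair $x,y$ satisfying $\|x+y\|=\|x\|+\|y\|$ (not from the answer):

```python
# Spot-check that phi(a) = ||x + a*y|| - ||x|| - a*||y|| vanishes on
# [0, inf) and is midpoint-convex, in (R^2, l1-norm).
def l1(v):
    """l1 norm of a 2-vector."""
    return abs(v[0]) + abs(v[1])

# Assumed example: x + y = (3, 4), so ||x+y|| = 7 = 3 + 4 = ||x|| + ||y||.
x, y = (2.0, 1.0), (1.0, 3.0)
assert l1((3.0, 4.0)) == l1(x) + l1(y)

def phi(a):
    return l1((x[0] + a * y[0], x[1] + a * y[1])) - l1(x) - a * l1(y)

alphas = [0.0, 0.25, 0.5, 1.0, 2.0, 10.0]
assert phi(0.0) == 0.0 and phi(1.0) == 0.0
for a in alphas:
    assert phi(a) <= 1e-12        # phi <= 0 by the triangle inequality
    assert abs(phi(a)) < 1e-12    # and in fact phi == 0 for all alpha >= 0

# midpoint-convexity spot-check: phi((s+t)/2) <= (phi(s) + phi(t)) / 2
for s in alphas:
    for t in alphas:
        assert phi((s + t) / 2) <= (phi(s) + phi(t)) / 2 + 1e-12
print("phi vanishes on [0, inf)")
```

Since both coordinates of $x+\alpha y$ stay nonnegative for $\alpha\ge0$ in this example, $\phi$ vanishes identically, matching the conclusion of the proof.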