Let $S = \{ (x,y,z) \in \mathbb R^3 : x^2 + y^2 + z^2 \le 1\}$. Prove $(1-\lambda)x + \lambda y \in S$ for $x=\lambda'y$, $\lambda' < 0$.


Let $S = \{ (x,y,z) \in \mathbb R^3 : x^2 + y^2 + z^2 \le 1\}$.

I've verified that $x,y \in S$ implies $(1-\lambda)x + \lambda y \in S$ for $0\le\lambda\le 1$ in two cases: when $x,y$ are linearly independent (using Pythagoras), and when $x = \lambda' y$ for $\lambda' \ge 0$.

However, I'm having trouble with the case $\lambda' < 0$.

Considering $|(1-\lambda)x + \lambda y|^2=|(1-\lambda)\lambda'y + \lambda y|^2$, I can't derive an inequality showing this is $\le 1$.

Here $|x|$ denotes the Euclidean norm.

Can someone help me out?

1 Answer


We have $$(1-\lambda)\lambda'y+\lambda y=(\lambda' +\lambda- \lambda\lambda')y.$$ Now, we may assume without loss of generality that $|y|\geq|x|$ (otherwise swap the roles of $x$ and $y$ and replace $\lambda$ by $1-\lambda$). Since $|x| = |\lambda'|\,|y|$, this gives $-1\leq \lambda'< 0$. Multiplying by $1-\lambda \in [0,1]$ yields $$-1\leq \lambda'(1-\lambda)\leq 0.$$ Adding $\lambda \in [0,1]$ gives $$-1 \le \lambda'(1-\lambda)+\lambda \le 1,$$ so the coefficient of $y$ has absolute value at most $1$, and hence $$|(1-\lambda)x+\lambda y| = |\lambda'(1-\lambda)+\lambda|\,|y| \le |y| \le 1.$$
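Not part of the proof, but as a sanity check: a small script can sample antiparallel pairs $x = \lambda' y$ with $-1 \le \lambda' < 0$ (the normalization from the argument above) and confirm numerically that every convex combination stays in the unit ball. All names here are ad hoc.

```python
import random

random.seed(0)

def norm(v):
    """Euclidean norm of a 3-vector given as a list."""
    return sum(c * c for c in v) ** 0.5

for _ in range(1000):
    # Random y in the closed unit ball (rescale if it lands outside).
    y = [random.uniform(-1, 1) for _ in range(3)]
    n = norm(y)
    if n > 1:
        y = [c / n for c in y]
    # x = lam_p * y with -1 <= lam_p < 0, so |x| <= |y| <= 1.
    lam_p = random.uniform(-1, 0)
    x = [lam_p * c for c in y]
    # Convex combination (1 - lam) x + lam y for 0 <= lam <= 1.
    lam = random.random()
    z = [(1 - lam) * a + lam * b for a, b in zip(x, y)]
    assert norm(z) <= 1 + 1e-12, (lam_p, lam)

print("all sampled convex combinations stayed in the unit ball")
```

This only spot-checks the claim, of course; the inequality on the coefficient of $y$ is what actually proves it.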