Weighted mean of vectors in an epsilon ball


I think this should be fairly straightforward, but I can't seem to find the right steps.

Let $B(u,r)$ be an open ball of radius $r$ around the point $u$ in $\mathbb{R}^n$, and suppose that $v,w \in B(u,r)$. Show that for all $t \in [0,1]$ that the vector $t\cdot v + (1-t)\cdot w \in B(u,r)$.

I've tried several norm inequalities, such as Cauchy–Schwarz and the triangle inequality, but I'm just not seeing the connection. I was also thinking we might be able to show this straight from the definition of distance in $\mathbb{R}^n$, but that seems unsatisfying; this result feels like it should hold in any normed space.

Thanks in advance!


There are 3 answers below.

Best answer:

Triangle inequality will do. $$\|tv+(1-t)w-u\|=\|tv-tu+(1-t)w-(1-t)u\|\le t\|v-u\|+(1-t)\|w-u\|<tr+(1-t)r=r.$$
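As a quick numerical sanity check of this bound (not part of the proof), the sketch below samples random points $v, w$ in an open ball and verifies that every convex combination satisfies both the triangle-inequality estimate and membership in the ball. The sampling helper is an ad-hoc construction for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_point_in_ball(u, r):
    """Sample a point strictly inside the open ball B(u, r):
    random direction times a random radius < r."""
    x = rng.normal(size=len(u))
    x /= np.linalg.norm(x)                       # unit vector
    return u + (0.999 * r) * rng.random() * x    # radius strictly below r

n, r = 5, 2.0
u = rng.normal(size=n)

for _ in range(1000):
    v = random_point_in_ball(u, r)
    w = random_point_in_ball(u, r)
    t = rng.random()
    combo = t * v + (1 - t) * w
    # the triangle-inequality bound from the answer above:
    bound = t * np.linalg.norm(v - u) + (1 - t) * np.linalg.norm(w - u)
    assert np.linalg.norm(combo - u) <= bound + 1e-12
    # and the conclusion: the combination stays inside B(u, r)
    assert np.linalg.norm(combo - u) < r
print("all convex combinations stayed inside B(u, r)")
```

The `1e-12` slack only absorbs floating-point rounding; the mathematical inequality itself is exact.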

Another answer:

$$ \begin{split} \|t v + (1-t) w - u\| & = \|t(v-u) + (1-t)(w-u)\| \\ & \leq t \|v-u\| + (1-t) \|w-u\| < t r + (1-t)r = r. \end{split} $$

Another answer:

Note that, writing $d(x,y)=\|x-y\|$, $$d(tv+(1-t)w,\,u)=\|t(v-u)+(1-t)(w-u)\|\le t\,\|v-u\|+(1-t)\,\|w-u\|$$

$$= t\,d(v,u)+(1-t)\,d(w,u) < tr+(1-t)r = r$$

Thus $$ t\cdot v + (1-t)\cdot w \in B(u,r)$$