I think this should be fairly straightforward, but I can't seem to find the right steps.
Let $B(u,r)$ be the open ball of radius $r$ around the point $u$ in $\mathbb{R}^n$, and suppose that $v,w \in B(u,r)$. Show that for all $t \in [0,1]$, the vector $t\cdot v + (1-t)\cdot w \in B(u,r)$.
I've tried using several norm inequalities, like Cauchy-Schwarz and the triangle inequality, and I'm just not seeing the connection. I was also thinking we might be able to show this straight from the definition of distance in $\mathbb{R}^n$, but that seems unsatisfying; this result seems like it should hold in any normed space (a general metric space has no vector addition, so the statement wouldn't even make sense there).
Thanks in advance!
The triangle inequality (together with homogeneity of the norm) will do: $$\|tv+(1-t)w-u\|=\|t(v-u)+(1-t)(w-u)\|\le t\|v-u\|+(1-t)\|w-u\|<tr+(1-t)r=r.$$ Note the last inequality is strict for every $t \in [0,1]$, since at least one of $t$, $1-t$ is positive.
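Not a proof, of course, but if it helps build intuition, here is a quick numerical sanity check of the claim for the Euclidean norm on $\mathbb{R}^3$ (the ball radius, dimension, and sample count are arbitrary choices):

```python
import random

def in_ball(x, u, r):
    """True if x lies in the open ball B(u, r) under the Euclidean norm."""
    return sum((xi - ui) ** 2 for xi, ui in zip(x, u)) ** 0.5 < r

def sample_in_ball(u, r):
    """Rejection-sample a point uniformly from the open ball B(u, r)."""
    n = len(u)
    while True:
        x = [ui + random.uniform(-r, r) for ui in u]
        if in_ball(x, u, r):
            return x

random.seed(0)
u, r = [0.0, 0.0, 0.0], 1.0

for _ in range(1000):
    v = sample_in_ball(u, r)
    w = sample_in_ball(u, r)
    t = random.random()
    # The convex combination t*v + (1-t)*w should stay inside the ball.
    point = [t * vi + (1 - t) * wi for vi, wi in zip(v, w)]
    assert in_ball(point, u, r)

print("all convex combinations stayed inside the ball")
```

Every random convex combination lands back inside the ball, exactly as the inequality chain above predicts.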