Let $\| \cdot\| $ be a strictly convex norm, meaning that if $u \neq v$ and $u,v \in B=\{ x: \lVert x\rVert \leq 1\}$, then for all $\lambda \in (0,1)$, $\lambda u + (1-\lambda)v \in \operatorname{interior}(B)$, i.e. $\lVert \lambda u + (1-\lambda)v\rVert < 1$.
I want to prove the equivalence $a \iff b$, where:
(a) $\| \cdot\| $ is a strictly convex norm
(b) $\lVert u+v\rVert = \lVert u\rVert + \lVert v \rVert \Rightarrow u=\alpha v$ for some $\alpha \in \mathbb{R}_{+}$
I don't get anywhere with any approach I take. Any tips?
The equivalence is a bit off: for nonzero $u, v$, $(b)$ should read $$\|u+v\| = \|u\| + \|v\| \implies u = \alpha v, \text{ for some } \alpha > 0$$
First notice that if $\|x+y\| = \|x\| + \|y\|$ holds, then for all $\alpha, \beta \ge 0$ we also have $$\|\alpha x+\beta y\| = \alpha\|x\| + \beta\|y\|$$
Namely, assuming without loss of generality that $\alpha \ge \beta$ (otherwise swap the roles of $x$ and $y$), the reverse triangle inequality gives:
\begin{align} \alpha\|x\| + \beta\|y\| &= \alpha(\|x\| + \|y\|) - (\alpha - \beta)\|y\| \\ &= \|\alpha(x+y)\| - \|(\alpha - \beta)y\|\\ &\le \|\alpha(x+y) - (\alpha - \beta)y\|\\ &= \|\alpha x + \beta y\|\\ &\le \alpha\|x\| + \beta\|y\| \end{align}
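As a quick numerical sanity check (my own illustration, not part of the proof): for the Euclidean norm, equality $\|x+y\| = \|x\| + \|y\|$ holds for positively collinear vectors, and the scaled identity can then be verified directly:

```python
import math

# Sanity check (illustration only): x and y below are positively
# collinear, so ||x + y|| = ||x|| + ||y|| for the Euclidean norm,
# and the scaled identity then holds for all a, b >= 0.
def norm(v):
    return math.sqrt(sum(c * c for c in v))

x = (2.0, 4.0)
y = (1.0, 2.0)  # y = x / 2, positively collinear

# Equality in the triangle inequality:
assert math.isclose(norm((3.0, 6.0)), norm(x) + norm(y))

# The scaled identity ||a*x + b*y|| = a*||x|| + b*||y||:
for a, b in [(0.5, 3.0), (2.0, 0.0), (1.5, 1.5)]:
    s = tuple(a * xi + b * yi for xi, yi in zip(x, y))
    assert math.isclose(norm(s), a * norm(x) + b * norm(y))
```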
Assume that $\|\cdot\|$ is a strictly convex norm and that $\|x+y\| = \|x\| + \|y\|$ for some $x, y \ne 0$.
For $\lambda \in (0, 1)$, setting $\alpha = \frac{\lambda}{\|x\|}$, $\beta = \frac{1-\lambda}{\|y\|}$ we have
$$\left\|\lambda\frac{x}{\|x\|} + (1-\lambda)\frac{y}{\|y\|}\right\| = \lambda\frac{\|x\|}{\|x\|} + (1-\lambda)\frac{\|y\|}{\|y\|} = 1$$
Since $\frac{x}{\|x\|}, \frac{y}{\|y\|} \in B$, strict convexity implies $$\frac{x}{\|x\|}= \frac{y}{\|y\|} \implies x = \underbrace{\frac{\|x\|}{\|y\|}}_{> 0}y$$
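As a side remark on why strict convexity is needed here: the $\ell^1$ norm is not strictly convex, and indeed $(b)$ fails for it. A quick check (my own illustration, not part of the proof):

```python
# Illustration only: the l1 norm is not strictly convex, and (b)
# fails for it -- x and y below attain equality in the triangle
# inequality, yet x is not a positive multiple of y.
def l1(v):
    return sum(abs(c) for c in v)

x = (1.0, 0.0)
y = (0.0, 1.0)
s = tuple(a + b for a, b in zip(x, y))

assert l1(s) == l1(x) + l1(y)      # equality holds...
assert x[0] * y[1] != x[1] * y[0]  # ...but x, y are not collinear
```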
Conversely, suppose that $(b)$ holds, and let $u, v \in B$ and $\lambda \in (0, 1)$ be such that $\|\lambda u + (1-\lambda)v\| = 1$; we show that $u = v$. By the triangle inequality,
$$\|\lambda u + (1-\lambda)v\| \le \lambda \underbrace{\|u\|}_{\le 1} + (1-\lambda)\underbrace{\|v\|}_{\le 1} \le \lambda + (1-\lambda) = 1$$
Since equality must hold throughout, $\|u\| = \|v\| = 1$, so we have: $$\|\lambda u + (1-\lambda)v\| = 1 = \|\lambda u\| + \|(1-\lambda)v\|$$
$(b)$ implies there exists $\alpha > 0$ such that $\lambda u = \alpha (1- \lambda) v$. Taking norms yields $\lambda = \alpha(1-\lambda)$, i.e. $\alpha = \frac{\lambda}{1-\lambda}$, hence $u = v$. This is exactly the contrapositive of strict convexity, so $\|\cdot\|$ is strictly convex.
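To make the contrapositive concrete (my own illustration, using the Euclidean norm, which is strictly convex): any proper convex combination of two distinct unit vectors lands strictly inside the unit ball:

```python
import math

# Illustration only: for the Euclidean norm, the midpoint of two
# distinct unit vectors has norm strictly less than 1.
def norm(v):
    return math.sqrt(sum(c * c for c in v))

u = (1.0, 0.0)
v = (0.0, 1.0)
lam = 0.5
mid = tuple(lam * ui + (1 - lam) * vi for ui, vi in zip(u, v))

assert norm(u) == norm(v) == 1.0
assert norm(mid) < 1.0  # strictly interior to the unit ball
```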