I'm starting to learn about topological vector spaces (TVS), and in a textbook by Kantorovich on Functional Analysis there is a lemma whose proof I have slight trouble understanding. It is stated as: LEMMA 1.
- Let $p$ be a non-negative gauge function. Then for any $\lambda \gt 0$ the sets $\{ x: p(x) \lt \lambda \}$ and $\{ x: p(x) \leq \lambda \}$ are convex and absorbent. If $p$ is a semi-norm, then these sets are absolutely convex.
Proof. 1) We prove only that the set $E_\lambda = \{ x: p(x) \lt \lambda \}$ is absorbent. If $m = \max(p(x), p(-x))$, then for $|\mu| \geq \lambda$ we have^
$$p(x/\mu) = \frac{1}{|\mu|}\,p\big((\operatorname{sign}\mu)\,x\big) \leq \frac{\lambda}{m+1}\,p\big((\operatorname{sign}\mu)\,x\big) \lt \lambda,$$ and so $x \in \mu E_\lambda$. Hence $E_\lambda$ is absorbent.
The part I do not get is the step where the author deduces $\frac{1}{|\mu|}\leq \frac{\lambda}{m+1}$ (**).
^Here $\operatorname{sign}\mu = |\mu|/\mu$ if $\mu \neq 0$, and $\operatorname{sign}\mu = 0$ if $\mu = 0$.
I tried to argue as follows: $m \lt \lambda \implies \frac{1}{\lambda + 1} \lt \frac{1}{m+1} \lt 1$,
which then implies $\frac{\lambda}{\lambda + 1} \lt \frac{\lambda}{m+1} \lt \lambda$, but I still can't see how (**) can be obtained. Is my assumption that $m \lt \lambda$ wrong? If not, then taking $\mu = \lambda = 1$ seems to make the author's inequality fail. I think there is something I'm missing.
It appears that there is a mistake in the proof, but you only have to show that $x \in \mu E_{\lambda}$ for some $\mu$. All you have to do is take $\mu > \frac{m+1}{\lambda}$, so that $\frac{1}{|\mu|} < \frac{\lambda}{m+1}$.
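To spell the fix out (this is only my reading, assuming the intended condition is $|\mu| \geq \frac{m+1}{\lambda}$ rather than $|\mu| \geq \lambda$, and that the scalars are real, so $\operatorname{sign}\mu = \pm 1$ and $p\big((\operatorname{sign}\mu)\,x\big) \leq m$): for any such $\mu$ we have $\frac{1}{|\mu|} \leq \frac{\lambda}{m+1}$, hence

$$p\!\left(\frac{x}{\mu}\right) = \frac{1}{|\mu|}\,p\big((\operatorname{sign}\mu)\,x\big) \leq \frac{\lambda}{m+1}\,p\big((\operatorname{sign}\mu)\,x\big) \leq \frac{\lambda m}{m+1} \lt \lambda,$$

so $x \in \mu E_\lambda$ for every $\mu$ with $|\mu| \geq \frac{m+1}{\lambda}$, i.e. for all sufficiently large $|\mu|$.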