Alternative to a proof in Kreyszig's functional analysis book?


In the book Introductory Functional Analysis (Erwin Kreyszig) there is the following result:

Theorem. Let $(X, \|\cdot\|)$ be a normed space and $\{e_1, \ldots, e_n \}$ be a linearly independent set. Then, there is a constant $c>0$ such that

$$\|\alpha_1 e_1+\ldots +\alpha_n e_n\|\geq c(|\alpha_1|+\ldots+|\alpha_n|)$$ for every $\alpha_1, \ldots, \alpha_n\in\mathbb K$.
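As a quick numerical sanity check (not part of the proof), one can estimate the constant $c$ in a concrete space. The sketch below works in $\mathbb R^2$ with the Euclidean norm; the linearly independent pair $e_1=(1,0)$, $e_2=(1,1)$ and the sample count are arbitrary illustrative choices.

```python
# Empirically estimate c = inf ||a1*e1 + a2*e2|| / (|a1| + |a2|)
# for the linearly independent (non-orthogonal) pair below.
import math
import random

e1 = (1.0, 0.0)
e2 = (1.0, 1.0)

def norm_of_combination(a1, a2):
    # Euclidean norm of a1*e1 + a2*e2
    x = a1 * e1[0] + a2 * e2[0]
    y = a1 * e1[1] + a2 * e2[1]
    return math.hypot(x, y)

random.seed(0)
ratios = []
for _ in range(10_000):
    a1 = random.uniform(-1, 1)
    a2 = random.uniform(-1, 1)
    s = abs(a1) + abs(a2)
    if s > 1e-9:  # skip the zero vector, where the ratio is undefined
        ratios.append(norm_of_combination(a1, a2) / s)

c_estimate = min(ratios)  # empirical lower bound for the constant c
```

For this particular pair the infimum is $1/\sqrt5\approx 0.447$, and the sampled minimum stays above it, consistent with the theorem.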

The book is very good, but I didn't particularly like the way this proof was written, so I came up with an alternative proof. It goes as follows:

Proof. The assertion is equivalent to showing that there exists a constant $c>0$ such that

$$\|\beta_1 e_1+\ldots +\beta_n e_n \|\geq c$$ for every $\beta_1, \ldots, \beta_n\in\mathbb K$ such that $\displaystyle \sum_{j=1}^n |\beta_j |=1$. In fact, suppose there is $c>0$ such that

$$\|\alpha_1e_1+\ldots +\alpha_n e_n\|\geq c(|\alpha_1|+\ldots+|\alpha_n|)$$ for every $\alpha_1, \ldots, \alpha_n\in\mathbb K$. In particular, for $\alpha_1=\beta_1, \ldots, \alpha_n=\beta_n$ where $\sum_{j=1}^n |\beta_j|=1$ we have

$$\|\beta_1 e_1+\ldots+\beta_n e_n \|=\|\alpha_1 e_1+\ldots+\alpha_n e_n\|\geq c(|\alpha_1|+\ldots+|\alpha_n|)=c(|\beta_1|+\ldots+|\beta_n|)=c.$$ Conversely, suppose there exists $c>0$ such that $$\|\beta_1 e_1+\ldots+\beta_n e_n \|\geq c$$ for every $\beta_1, \ldots, \beta_n\in\mathbb K$ with $\displaystyle \sum_{j=1}^n |\beta_j|=1$. Let $\alpha_1, \ldots, \alpha_n\in\mathbb K$. One may suppose $s:=|\alpha_1|+\ldots+|\alpha_n|\neq 0$, otherwise there is nothing to show. Taking $$\beta_j:=\alpha_j/s,$$ we have

$$\sum_{j=1}^n |\beta_j|=\sum_{j=1}^n \frac{|\alpha_j|}{s}=\frac{1}{s}\sum_{j=1}^n |\alpha_j|=\frac{1}{s}s=1.$$ Hence, by hypothesis:

$$\frac{1}{s}\|\alpha_1 e_1+\ldots+\alpha_n e_n\|=\left\|\frac{\alpha_1}{s}e_1+\ldots+\frac{\alpha_n}{s} e_n\right\|=\|\beta_1e_1+\ldots+ \beta_n e_n \|\geq c,$$ which implies $$\|\alpha_1 e_1+\ldots+\alpha_n e_n\|\geq cs=c(|\alpha_1|+\ldots+|\alpha_n|).$$
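The normalization step above can be checked numerically. The sketch below uses the standard basis of $\mathbb R^3$ with the Euclidean norm and an arbitrary scalar tuple, both purely illustrative: it verifies that $\beta=\alpha/s$ lies on the $\ell^1$ sphere and that homogeneity gives $\|\sum_j \alpha_j e_j\| = s\,\|\sum_j \beta_j e_j\|$.

```python
# Normalization beta_j := alpha_j / s, with s = |a1| + ... + |an|.
import math

def euclidean_norm(v):
    return math.sqrt(sum(x * x for x in v))

# With e_j the standard basis of R^3, sum a_j e_j is just the tuple alpha.
alpha = [3.0, -1.5, 0.5]          # arbitrary nonzero scalars
s = sum(abs(a) for a in alpha)    # s = 5.0 here
beta = [a / s for a in alpha]

l1_of_beta = sum(abs(b) for b in beta)   # should equal 1
lhs = euclidean_norm(alpha)              # ||sum a_j e_j||
rhs = s * euclidean_norm(beta)           # s * ||sum b_j e_j||
```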

Finally, let us show the main result. By contradiction, suppose that for every $m\in\mathbb N$ there are scalars $\beta_1^{(m)}, \ldots, \beta_n^{(m)}$ such that $\sum_{j=1}^n |\beta_j^{(m)}|=1$ but

$$\|\beta_1^{(m)}e_1+\ldots+\beta_n^{(m)}e_n\|<\frac{1}{m}.$$ In particular, $y_m:=\beta_1^{(m)}e_1+\ldots+\beta_n^{(m)} e_n$ defines a sequence of points of $X$ converging to $0$. Next, notice that $(\beta_1^{(m)}, \ldots, \beta_n^{(m)})_{m\in\mathbb N}$ is a sequence of points in the set $$S:=\Big\{(x_1, \ldots, x_n)\in\mathbb K^n: \sum_{j=1}^n |x_j|=1\Big\}.$$ But $S$ is compact, since it is closed and bounded: indeed, $S$ is the preimage of the closed set $\{1\}\subset \mathbb R$ under the continuous map $(x_1, \ldots, x_n)\longmapsto \sum_{j=1}^n |x_j|$, and the boundedness is immediate. Consequently, $(\beta_1^{(m)}, \ldots, \beta_n^{(m)})_{m\in\mathbb N}$ has a subsequence $(\beta_1^{(m_k)}, \ldots, \beta_n^{(m_k)})_{k\in\mathbb N}$ converging to some $(\beta_1, \ldots, \beta_n)\in S$. But then $$y_{m_k}=\beta_1^{(m_k)}e_1+\ldots+\beta_n^{(m_k)}e_n$$ is a subsequence of $(y_m)_{m\in\mathbb N}$ which converges to $$y=\beta_1 e_1+\ldots+\beta_n e_n$$ as $k\to \infty$, since the map $(x_1,\ldots,x_n)\longmapsto x_1e_1+\ldots+x_ne_n$ is continuous. Since $(y_{m_k})_{k\in\mathbb N}$ is a subsequence of $(y_m)_{m\in\mathbb N}$, which has limit $0$, it follows that $$y=\beta_1e_1+\ldots+\beta_n e_n=0.$$ Since $\{e_1, \ldots, e_n\}$ is linearly independent, this forces $\beta_1=\ldots=\beta_n=0$, which contradicts $\sum_{j=1}^n |\beta_j|=1$.
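The compactness argument says the continuous function $\beta\mapsto\|\beta_1e_1+\ldots+\beta_ne_n\|$ attains a minimum on $S$, which is positive by linear independence. The sketch below approximates that minimum by a grid search for an illustrative pair $e_1=(1,0)$, $e_2=(1,1)$ in the Euclidean plane (all choices here are mine, not from the book); a short computation shows the true minimum is $1/\sqrt5$, attained at $(\beta_1,\beta_2)=(\mp 3/5,\pm 2/5)$.

```python
# Approximate min over S = {(b1, b2) : |b1| + |b2| = 1} of ||b1*e1 + b2*e2||.
import math

e1 = (1.0, 0.0)
e2 = (1.0, 1.0)

def norm_on_sphere(b1, b2):
    x = b1 * e1[0] + b2 * e2[0]
    y = b1 * e1[1] + b2 * e2[1]
    return math.hypot(x, y)

# Parametrize S: b2 = t in [-1, 1], b1 = +-(1 - |t|).
min_norm = float("inf")
steps = 2000
for k in range(steps + 1):
    t = -1.0 + 2.0 * k / steps
    for sign in (1.0, -1.0):
        min_norm = min(min_norm, norm_on_sphere(sign * (1.0 - abs(t)), t))
```

The grid minimum agrees with $1/\sqrt5\approx 0.4472$ to within the grid resolution, and in particular is bounded away from $0$, as the proof requires.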

Is my proof correct?

As a matter of fact, I believe the author didn't want to use the compactness of $S$, so he used the Bolzano-Weierstrass theorem instead to obtain the convergent subsequence.