Lemma 2.4-1 in Erwin Kreyszig's "Introductory Functional Analysis with Applications": Is there an easier proof?


Here's the statement:

Let $\{x_1, \ldots, x_n \}$ be a linearly independent set of vectors in a normed space $X$ (of any dimension). Then there is a real number $c > 0$ such that for every choice of scalars $\alpha_1, \ldots, \alpha_n$, we have $$\Vert \alpha_1 x_1 + \ldots + \alpha_n x_n \Vert \geq c (\lvert\alpha_1\rvert + \ldots + \lvert\alpha_n\rvert).$$

Now, although I understand Kreyszig's proof, I find it not so clean, at least as far as the notation goes. So I wonder: is there a cleaner, easier proof that doesn't require too many prerequisites?

Or, by suitably modifying the notation, is there a better way of presenting Kreyszig's proof?


BEST ANSWER

Everything boils down to just a few key facts:

  • A non-vanishing continuous real function on the unit sphere (unit vectors) $S$ in $\mathbb{C}^{n}$ achieves its minimum (non-zero) value at some point.
  • The Cauchy-Schwarz inequality for inner products, $|(x,y)| \le \|x\|\|y\|$, of which the following is a special case: $|\sum_j a_j b_j| \le (\sum_j |a_j|^{2})^{1/2}(\sum_j|b_j|^{2})^{1/2}$.
  • Reverse triangle inequality for a norm: $|\,\|x\|-\|y\|\,| \le \|x-y\|$.

Proof of your Theorem: The function $F(\alpha)=\|\alpha_1 x_1+\alpha_2 x_2+\cdots+\alpha_n x_n\|$ is continuous from $\mathbb{C}^{n}$ to $\mathbb{R}$. This follows from the reverse triangle inequality and the Cauchy-Schwarz inequality: $$ \begin{align} |F(\alpha)-F(\alpha')| & = \Big|\;\|\sum_j\alpha_j x_j\|-\|\sum_j\alpha_j'x_j\|\;\Big| \\ & \le \|\sum_j (\alpha_j-\alpha_j') x_j\| \\ & \le \sum_j |\alpha_j-\alpha_j'|\,\|x_j\| \\ & \le \Big(\sum_j |\alpha_j-\alpha_j'|^{2}\Big)^{1/2}\Big(\sum_{j}\|x_j\|^{2}\Big)^{1/2} = C\|\alpha-\alpha'\|_{\mathbb{C}^{n}}. \end{align} $$ By the linear independence of the vectors $\{ x_j \}$, $F(\alpha)=0$ iff $\alpha=0$. So $F$ is a non-vanishing continuous function on the unit sphere $S = \{ \alpha : \|\alpha\|=1\}$ and, therefore, attains its minimum at some $\alpha_{0} \in S$. That is, $F(\alpha) \ge F(\alpha_{0}) > 0$ for all $\alpha \in S$.
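As a quick numerical sanity check of this Lipschitz estimate (a minimal sketch, assuming the hypothetical example vectors $x_1=(1,0)$, $x_2=(1,1)$ in $\mathbb{R}^2$ with the Euclidean norm; not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example vectors (assumed for illustration); rows are x_1, x_2,
# which are linearly independent in R^2.
xs = np.array([[1.0, 0.0],
               [1.0, 1.0]])

def F(alpha):
    """F(alpha) = || alpha_1 x_1 + alpha_2 x_2 || in the Euclidean norm."""
    return np.linalg.norm(alpha @ xs)

# Lipschitz constant from the estimate above: C = (sum_j ||x_j||^2)^(1/2)
C = np.sqrt((np.linalg.norm(xs, axis=1) ** 2).sum())

# Largest observed ratio |F(a) - F(b)| / ||a - b|| over random pairs;
# it should never exceed C.
pairs = rng.normal(size=(1000, 2, 2))
max_ratio = max(abs(F(a) - F(b)) / np.linalg.norm(a - b) for a, b in pairs)
```

Every sampled ratio stays below $C$, as the chain of inequalities predicts.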

For any $\alpha \in \mathbb{C}^{n}$ and $\rho > 0$, one has $F(\rho\alpha)=\rho F(\alpha)$. Therefore, if $\alpha \ne 0$ and $k=F(\alpha_{0})$, $$ F(\alpha) = \|\alpha\|\, F\!\left(\frac{1}{\|\alpha\|}\alpha\right) \ge k\|\alpha\|. $$ Using the Cauchy-Schwarz inequality one more time, $$ \sum_{j}|\alpha_j| \le \Big(\sum_j|\alpha_j|^{2}\Big)^{1/2}\Big(\sum_j 1^{2}\Big)^{1/2}=\sqrt{n}\,\|\alpha\| \le \frac{\sqrt{n}}{k}F(\alpha), $$ that is, $$ \frac{k}{\sqrt{n}}\sum_{j}|\alpha_j| \le F(\alpha). $$ Setting $c=k/\sqrt{n}$ gives the stated inequality $$ c(|\alpha_1|+|\alpha_2|+\cdots+|\alpha_{n}|) \le \|\alpha_1 x_1+\alpha_2 x_2+ \cdots + \alpha_n x_n\|. $$
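The constant $c = k/\sqrt{n}$ produced by this proof can be seen concretely in a numerical sketch (again with the assumed example vectors $x_1=(1,0)$, $x_2=(1,1)$ in $\mathbb{R}^2$, Euclidean norm): approximate $k = \min_{\|\alpha\|_2 = 1} F(\alpha)$ by sampling the unit sphere, then check $F(\alpha) \ge c\,(|\alpha_1|+\cdots+|\alpha_n|)$ on random $\alpha$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example vectors (assumed for illustration); rows are x_1, x_2
xs = np.array([[1.0, 0.0],
               [1.0, 1.0]])
n = len(xs)

def F(alpha):
    return np.linalg.norm(alpha @ xs)

# Approximate k = min of F over the Euclidean unit sphere by dense sampling.
# The sampled minimum slightly overestimates the true minimum, but the true
# worst ratio F(alpha)/||alpha||_1 is strictly larger than k/sqrt(n) here.
dirs = rng.normal(size=(100_000, n))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
k = np.linalg.norm(dirs @ xs, axis=1).min()

c = k / np.sqrt(n)  # the constant the proof constructs

# Verify F(alpha) >= c * (|alpha_1| + ... + |alpha_n|) on random alphas
ok = all(F(a) >= c * np.abs(a).sum() - 1e-9
         for a in rng.normal(size=(1000, n)))
```

By scaling ($F(\rho\alpha)=\rho F(\alpha)$), checking random $\alpha$ of any size is equivalent to checking directions only.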


I think there is a much easier proof if you know some linear algebra.

As in the text, it's enough if we show that

$$||\beta_1 x_1 + \beta_2 x_2 + \ldots + \beta_n x_n|| \geq c $$ whenever $\sum_{j=1}^{n} |\beta_j| = 1 $, i.e. $||\beta||_1 = 1$.

Let $X$ be the matrix whose $j$th column is $x_j$, and let $\beta$ be the vector whose $j$th entry is $\beta_j$. Then we want to show $||X \beta || \geq c$ whenever $||\beta||_1 = 1$.

Consider the singular value decomposition of $X$. There exist positive real scalars $s_1, \ldots, s_n$ and orthonormal bases $e_1, \ldots, e_n$ and $f_1, \ldots, f_n$ such that $$X \beta = s_1 \langle \beta, e_1\rangle f_1 + \ldots+ s_n \langle \beta, e_n\rangle f_n $$

Let $s_k = \min\{s_1, \ldots, s_n \}$. Since $f_1, \ldots, f_n$ are orthonormal, we have \begin{align} ||X \beta||_2^2 &= s_1^2 \langle \beta, e_1\rangle^2 + \ldots+s_n^2 \langle \beta, e_n\rangle^2 \\ &\geq s_k^2 \left( \langle \beta, e_1\rangle^2 + \ldots +\langle \beta, e_n\rangle^2 \right) \end{align}

Since $e_1, \ldots, e_n$ are orthonormal, $\left( \langle \beta, e_1\rangle^2 + \ldots +\langle \beta, e_n\rangle^2 \right) = || \beta ||_2^2$. If $||\beta||_1 = 1$, then by Cauchy-Schwarz $||\beta||_2 \geq ||\beta||_1/\sqrt{n} = 1/\sqrt{n}$. This implies: $$ \begin{align} ||X \beta||_2 &\geq \frac{s_k}{\sqrt{n}}, \end{align} $$ so we may take $c = s_k/\sqrt{n}$.

(I've realized that I've implicitly assumed that each $x_j$ is $n$-dimensional and that the norm on $X$ is the Euclidean one. For other dimensions, the argument should only change trivially.)
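This SVD argument is easy to check numerically (a minimal sketch with an assumed $2\times 2$ example matrix; `numpy.linalg.svd` supplies the singular values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical example (assumed for illustration): the independent x_j
# are the COLUMNS of X.
X = np.array([[1.0, 1.0],
              [0.0, 1.0]])
n = X.shape[1]

# Singular values of X; the smallest is s_k in the argument above.
s = np.linalg.svd(X, compute_uv=False)
s_min = s.min()  # positive iff the columns are linearly independent

c = s_min / np.sqrt(n)  # constant after converting ||.||_2 to ||.||_1

# Check the chain ||X b||_2 >= s_min ||b||_2 >= (s_min/sqrt(n)) ||b||_1
ok = all(np.linalg.norm(X @ b) >= c * np.abs(b).sum() - 1e-9
         for b in rng.normal(size=(1000, n)))
```

Note that $s_{\min} > 0$ is exactly the full-column-rank (linear independence) condition, so the bound degenerates to $c = 0$ precisely when the $x_j$ are dependent.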


If the reader has already been introduced to the concept of equivalence of norms, then they may easily spot why $S = \{ \alpha \in \mathbb{C}^n : ||\alpha||_1 = 1 \}$ is a compact set w.r.t. the standard norm of $\mathbb{C}^n$. In this case, the analytic proof given above can be simplified.

Indeed, $F$ is continuous for the same reason as before, but now the existence of a minimizer $\alpha_0 \in S$ gives $c = F(\alpha_0)$ directly:

$$\forall \alpha \in S: ||\alpha_1x_1 + \alpha_2x_2 + \cdots + \alpha_nx_n|| \geq F(\alpha_0)$$