Example of a vector set that is ω-independent but not minimal

I'm stuck proving the statement below. Minimality of a vector set implies $\omega$-independence (see the definitions below), but the converse is not true in general; both notions generalize linear independence of vectors. The statement below gives an example in which $\omega$-independence does not imply minimality.

Let $\mathcal{H}$ be a Hilbert space. Further, assume that $(e_k)_{k=1}^{\infty}$ is an orthonormal basis for $\mathcal{H}$.

$\underline{To\space Show:}$

The set $\{e_1\} \cup \{e_k + e_{k+1} : k \in \mathbb{N}\}$ is $\omega$-independent, but not minimal.

$\underline{Definition:}$

Let $X$ be a Banach space and let $(\varphi_i)_{i\in I} \subset X$. Then $(\varphi_i)_{i\in I}$ is

  • $\underline{\omega\text{-independent}}$ if, whenever $\sum_{i\in I}c_i\varphi_i$ converges and equals zero for some scalars $(c_i)_{i\in I} \subset \mathbb{K}$, then $c_i = 0$ for all $i \in I$;
  • $\underline{minimal}$ if $\varphi_j \notin \overline{\operatorname{span}\{\varphi_i : i \neq j \}}$ for all $j \in I$.

My attempt at the proof looks like this, but I'm stuck:

$\underline{Proof:}$

First, we'll have to show that the set above is indeed $\omega$-independent. Afterwards, we'll show that it is not minimal.

Let $(c_k)_{k\in \mathbb{N}} \subset \mathbb{K}$ such that

$$ c_1 e_1 + \sum_{k\in \mathbb{N}}c_{k+1} (e_k + e_{k+1}) = 0. $$

Since $(e_k)_{k=1}^{\infty}$ is an ONB, we can break this sum up, so this is equivalent to

$$ c_1 e_1 + \sum_{k\in \mathbb{N}}c_{k+1} e_k + \sum_{k\in \mathbb{N}}c_{k+1} e_{k+1} = 0. $$

$$ \iff \sum_{k\in \mathbb{N}}c_{k+1} e_k + \sum_{k\in \mathbb{N}}c_k e_k = 0. $$

$$ \iff \sum_{k\in \mathbb{N}}(c_k + c_{k+1}) e_k = 0. $$

$$ \implies c_k + c_{k+1} = 0 \space\space \forall k \in \mathbb{N}.$$

$$ \iff c_{k+1} = - c_k \space\space \forall k \in \mathbb{N}.$$

At this point I'm stuck: this means the $c_k$ don't all have to be zero, so I must have made a mistake somewhere. Any help is appreciated.
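As a quick numerical look at where the recursion leads (a sketch of my own, not part of the attempt above; the choice $c_1 = 1$, hence $c_k = (-1)^{k-1}$, is the only candidate the relation $c_{k+1} = -c_k$ allows up to scaling): the partial sums of $c_1 e_1 + \sum_k c_{k+1}(e_k + e_{k+1})$ telescope to $\pm e_m$, so their norm never shrinks below $|c_1|$.

```python
import numpy as np

def partial_sum(m, c1=1.0, dim=50):
    """Partial sum c_1 e_1 + sum_{k=1}^{m-1} c_{k+1} (e_k + e_{k+1}) in R^dim,
    with the forced coefficients c_{k+1} = -c_k, i.e. c_k = (-1)**(k-1) * c_1."""
    c = [c1 * (-1) ** j for j in range(dim)]  # c[j] holds c_{j+1}
    s = np.zeros(dim)
    s[0] += c[0]                              # the c_1 e_1 term
    for k in range(1, m):                     # terms k = 1, ..., m-1
        s[k - 1] += c[k]                      # c_{k+1} e_k
        s[k] += c[k]                          # c_{k+1} e_{k+1}
    return s

for m in [2, 5, 20]:
    print(m, np.linalg.norm(partial_sum(m)))  # norm is |c_1| = 1.0 for every m
```

Since consecutive partial sums sit on different basis directions, the sequence of partial sums is not even Cauchy unless $c_1 = 0$.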

Accepted answer:

I set $g_1 = e_1$ and $g_n = e_{n-1} + e_{n}$ for $n \ge 2$.

You have found out that if $\sum_n c_n g_n = 0$, then $c_{k+1} = -c_k$ for all $k$, so all the $|c_k|$ are equal; set $c = |c_1|$. This implies that either $c = 0$ or $\sum_k c_k g_k$ does not converge at all. Indeed, the relation makes the partial sums telescope, $\sum_{k=1}^m c_k g_k = c_m e_{m}$, so $\|\sum_{k=1}^m c_k g_k\| = c$ for all $m$, and $\|c_m e_m - c_{m'} e_{m'}\| = c\sqrt{2}$ for $m \neq m'$, so the partial sums are not Cauchy unless $c = 0$. This gives $\omega$-independence. Showing that $(g_n)_{n\in \mathbb N}$ is not minimal takes a separate argument (suitably damped alternating telescoping sums of the $g_n$, $n \ge 2$, approximate $e_1$ in norm). Hence, I also give a reference to a fully worked example below.
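To probe the minimality question numerically, here is a sketch of my own (the damped alternating weights $a_n = (-1)^n (N+1-n)/N$ are one convenient choice, not taken from the discussion above): the combination $\sum_{n=2}^{N} a_n (e_{n-1} + e_n)$ has squared distance exactly $1/N$ from $e_1$, which suggests $e_1 \in \overline{\operatorname{span}}\{g_n : n \ge 2\}$, i.e. the family fails to be minimal.

```python
import numpy as np

def dist_to_e1(N):
    """Distance in R^N from e_1 to sum_{n=2}^{N} a_n (e_{n-1} + e_n)
    with the damped alternating weights a_n = (-1)**n * (N + 1 - n) / N."""
    v = np.zeros(N)
    for n in range(2, N + 1):
        a = (-1) ** n * (N + 1 - n) / N
        v[n - 2] += a          # contribution to e_{n-1}
        v[n - 1] += a          # contribution to e_n
    e1 = np.zeros(N)
    e1[0] = 1.0
    return np.linalg.norm(e1 - v)

for N in [10, 100, 10000]:
    print(N, dist_to_e1(N))    # equals sqrt(1/N), tending to 0
```

Checking by hand: the $e_1$-coefficient of the combination is $a_2 = (N-1)/N$, the middle coefficients $a_j + a_{j+1}$ are $\pm 1/N$, and the last is $a_N = \pm 1/N$, so the squared error is $N \cdot (1/N^2) = 1/N$.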

You can have a look at Section 6 of Christopher Heil's "A Basis Theory Primer", in particular Theorem 6.2 and the following Example 6.4. There it is shown that minimality implies $\omega$-independence but that $\omega$-independence does not imply minimality, which is exactly what you ask for.

The example given there of a sequence that is $\omega$-independent but not minimal is very similar to your setting. It is built on the sequence $(f_n)_{n\in \mathbb N}$ with $f_1 = e_1$ and $f_n = e_1 + e_n/n$ for $n \ge 2$, where $(e_n)_{n\in \mathbb N}$ is an ONB.
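The failure of minimality in that example can be checked directly (a small NumPy sketch of my own, truncating to finitely many coordinates): $\|f_n - f_1\| = \|e_n/n\| = 1/n \to 0$, so $f_1$ lies in the closure of $\{f_n : n \ge 2\}$, while the book's convergence argument shows the sequence is still $\omega$-independent.

```python
import numpy as np

def f(n, dim=200):
    """f_1 = e_1 and f_n = e_1 + e_n / n for n >= 2, as vectors in R^dim."""
    v = np.zeros(dim)
    v[0] = 1.0                 # the e_1 part
    if n >= 2:
        v[n - 1] += 1.0 / n    # the e_n / n part
    return v

for n in [2, 10, 100]:
    print(n, np.linalg.norm(f(n) - f(1)))  # equals 1/n, tending to 0
```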