Unconditional convergence implies convergence of sum of norm squares in hilbert space


Let $(H, (\cdot,\cdot))$ be a Hilbert space and consider the definition of unconditional convergence: for a set $I \neq \emptyset$ and a family of vectors $(x_i)_{i \in I}$ in $H$, the series $$ \sum_{i \in I}x_i$$ is said to converge unconditionally to a value $x \in H$ if for every $\epsilon > 0$ there is a finite set $I_0 \subseteq I$ such that $$ \left\lVert \sum_{i \in \tilde{I}}x_i - x\right\rVert<\epsilon$$ for all finite $\tilde{I}$ with $I_0 \subseteq \tilde{I} \subseteq I$.

Using this definition I wanted to prove that the unconditional convergence of a series $ \sum_{n \in \mathbb{N}} x_n$ in $H$ implies the convergence of $\sum_{n=1}^{\infty} \lVert x_n \rVert^2$.

Using the unconditional convergence of $ \sum_{n \in \mathbb{N}} x_n =: y$ and the fact that the inner product is linear and continuous in each argument, we deduce that

$$(y,y) = (\sum_{n \in \mathbb{N}}x_n, \sum_{m \in \mathbb{N}}x_m) = \sum_{n \in \mathbb{N}}\sum_{m \in \mathbb{N}}(x_n, x_m)$$

and analogously with the order of the two sums interchanged. Here, the sums on the right-hand side converge unconditionally. From there I want to infer (and this is the step I do not know how to justify) that $$ \sum_{(n,m) \in \mathbb{N} \times \mathbb{N}}(x_n,x_m)$$ converges unconditionally as a series over the index set $\mathbb{N} \times \mathbb{N}$, which would then give me the unconditional convergence of $$ \sum_{(n,m) \in \mathbb{N} \times \mathbb{N}}\lvert (x_n,x_m) \rvert.$$

Hence we would get $$\sum_{n=1}^{\infty} \lVert x_n \rVert^2 = \sum_{n \in \mathbb{N}}(x_n, x_n) \leq \sum_{(n,m) \in \mathbb{N} \times \mathbb{N}}\lvert (x_n,x_m) \rvert.$$ My question is whether this argument is correct and, if so, how the step I marked as "I do not know how to justify" can be carried out.

Thanks in advance.