Complete basis for space of summable monotonically decreasing sequences


I'm trying to extract a response function from some input and output data. This response function is a sequence of values $r_n$ with $n\in\mathbb{N}$. There is good reason to expect that $(r_n)$ is monotonically decreasing and summable, meaning that the sum $\sum_{i\in\mathbb{N}}r_i$ exists.

Now, if I have a complete basis $\{b^1,b^2,b^3,\dots\}$ for this space of sequences, then I can decompose $(r_i)=a_1b^1+a_2b^2+\dots$ and use linear fitting on my data to find the coefficients $a_i$.
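For concreteness, the fitting step I have in mind looks something like the sketch below. The basis, its truncation, and the "measured" data are all made up for illustration; it just shows how the coefficients $a_i$ would be recovered by least squares once a (truncated) basis is fixed.

```python
import numpy as np

# Illustrative sketch of the fitting step (made-up basis and data):
# approximate a response r by a linear combination a_1 b^1 + ... + a_K b^K.
n = 100                              # truncation length of the sequences
K = 3                                # number of basis sequences used
i = np.arange(1, n + 1)

# Candidate basis b^k = (e^{-k i})_i from the question, truncated to n entries.
B = np.exp(-np.outer(i, np.arange(1, K + 1)))    # shape (n, K)

# Synthetic "measured" response built from the same basis, for illustration.
a_true = np.array([1.0, 0.5, 0.25])
r = B @ a_true

# Linear fit for the coefficients a_i.
a_fit, *_ = np.linalg.lstsq(B, r, rcond=None)
```

(Whether this particular exponential basis can represent every sequence in the space is exactly the question.)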

I am not a mathematician, and I would appreciate any help in finding a complete basis for the space of monotonically decreasing summable sequences. How does one know that a basis is complete? My naive guess was the basis $b^k=(e^{-ki})_{i\in\mathbb{N}}$. Is this basis complete?

Edit: For clarification, the components $a_i$ are nonnegative. If this restriction makes the space something other than a vector space (a module, maybe), is that a problem for the existence of a basis?


Best answer:

First, let me explain the difference between a vector space and the structure you are considering, by giving an example related to your set of sequences.

The set $c$ of convergent sequences is a vector space over the real numbers $\mathbb{R}$: we have an addition (defined by $(x_n)_n + (y_n)_n = (x_n + y_n)_n$ for $(x_n)_n$, $(y_n)_n \in c$) and a scalar multiplication (defined by $\lambda (x_n)_n = (\lambda x_n)_n$ for $(x_n)_n \in c$ and $\lambda \in \mathbb{R}$), and these satisfy certain properties: for example, $+$ is associative and commutative, the sequence which is constantly $0$ is a neutral element for $+$, and for every sequence $x$ there exists $y$ with $x + y = 0$.

Note also that in a vector space you can only take finite sums; you need a notion of convergence to talk about infinite sums. In the case of $c$, we can define a norm $\| \cdot \|$ by setting $$\|x\| = \sup\{|x_n| \:|\: n \geq 1\}$$ and then say that a sequence $(x(n))_n$ of sequences $x(n)$ converges to a sequence $x$ if and only if $\|x(n) - x\| \to 0$ as $n \to \infty$.

This way $c$ becomes a normed vector space, and we can define a basis to be a set $\{b^1, b^2, \dots\}$ such that every element $x$ can be (uniquely) written as $x = \sum_{i=1}^{\infty} a_i b^i$, where the $a_i$ are real numbers and the infinite sum means that $x$ is the limit of the sequence of partial sums $(\sum_{i = 1}^N a_i b^i)_{N}$. An example of such a basis is given by $\{e_1, e_2, \dots\}$, where $e_k$ has $1$ as its $k$th entry and is $0$ everywhere else. To be precise, this is a basis not of all of $c$, but of the subspace $c_0$ of sequences converging to $0$ — which is the relevant subspace here, since summable sequences converge to $0$. In fact, every $x \in c_0$ can be written as $x = \sum_{k = 1}^\infty x_k e_k$.
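To make this convergence concrete, here is a small numerical sketch (finite truncation, with the sequence chosen purely for illustration): for a sequence tending to $0$, the sup-norm distance between $x$ and the partial sum $\sum_{k \le N} x_k e_k$ is just the largest remaining entry, and it shrinks as $N$ grows.

```python
import numpy as np

# Finite truncation of x = (1/k^2)_k, which converges to 0.
n = 1000
x = 1.0 / np.arange(1, n + 1) ** 2

def sup_error(N):
    """Sup-norm distance between x and the partial sum of x_k e_k for k <= N."""
    partial = np.zeros(n)
    partial[:N] = x[:N]          # sum_{k=1}^{N} x_k e_k, truncated to n entries
    return np.max(np.abs(x - partial))

# Since x is decreasing, the error after N terms is exactly x_{N+1}.
errors = [sup_error(N) for N in (10, 100, 500)]
```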

On the other hand, let us consider the set $m$ of monotonically decreasing sequences $x$ for which $\sum_{k = 1}^\infty x_k$ exists. The last condition implies that $m$ is a subset of $c$ (in fact, every $x \in m$ converges to $0$, and monotonicity then forces all its entries to be nonnegative), and one sees easily that the sum of two sequences in $m$ is again in $m$. However, the additive inverse of a non-zero element of $m$ is not in $m$, and the scalar multiple $\lambda x$ of a non-zero element $x \in m$ lies in $m$ only if $\lambda$ is nonnegative. Thus $m$ does not carry a vector space structure.

Still, we can add elements of $m$, take nonnegative scalar multiples, and use the norm $\|\cdot\|$ to make sense of infinite sums. We can now define a 'basis' for $m$ to be a set $\{b^1,b^2,\dots\}$ of elements of $m$ such that every $x \in m$ can be (uniquely) written as $x = \sum_{i = 1}^\infty a_i b^i$, where the $a_i$ are now nonnegative real numbers.

Let us show that such a 'basis' exists.

The sequence $E_N = \sum_{k = 1}^{N} e_k$ has a $1$ in its first $N$ entries and $0$ everywhere else; in particular, $E_N$ is an element of $m$. Note that $e_N = E_N - E_{N-1}$ for $N > 1$ and $e_1 = E_1$. Let us show that $\{E_1, E_2, \dots\}$ is a 'basis' for $m$.

Let $x = (x_k)_{k} \in m$. We have \begin{align} x = \sum_{k = 1}^\infty x_k e_k &= x_1 E_1 + \sum_{k = 2}^\infty x_k (E_k - E_{k-1}) \\ &= \sum_{k = 1}^\infty x_k E_k - \sum_{k = 2}^\infty x_{k} E_{k-1} \\ &= \sum_{k = 1}^\infty x_k E_k - \sum_{k = 1}^\infty x_{k + 1} E_{k} = \sum_{k = 1}^\infty (x_k - x_{k + 1}) E_k. \end{align} For the third equality we have to make sure that the infinite sums $\sum_{k = 1}^\infty x_k E_k$ and $\sum_{k = 2}^\infty x_k E_{k-1}$ make sense. However, $$\sum_{k = 1}^N x_k E_k = \sum_{k = 1}^N x_k \sum_{l = 1}^k e_l = \sum_{l = 1}^N \Big(\sum_{k = l}^N x_k\Big) e_l,$$ so for $N \to \infty$ the partial sums $\sum_{k = 1}^N x_k E_k$ converge to the sequence $(\sum_{n = k}^\infty x_n)_k$. A similar calculation shows that $\sum_{k = 2}^\infty x_{k} E_{k -1}$ makes sense.

As the first calculation shows, $x$ can be written as a nonnegative linear combination of the $E_k$ (by assumption $x_k - x_{k+1}$ is nonnegative), so $\{E_1, E_2, \dots\}$ is a 'basis' of the desired form.
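This decomposition is easy to check numerically. The sketch below (a finite truncation, with $x_k = 2^{-k}$ as an illustrative element of $m$) computes the coefficients $a_k = x_k - x_{k+1}$ and reconstructs $x$ as $\sum_k a_k E_k$.

```python
import numpy as np

# Decompose a monotonically decreasing summable sequence x in the 'basis'
# E_k = (1, ..., 1, 0, 0, ...)  (k ones), with coefficients a_k = x_k - x_{k+1}.
n = 50
x = 0.5 ** np.arange(1, n + 1)          # x_k = 2^{-k}: decreasing, summable

# Coefficients a_k = x_k - x_{k+1}  (treat x_{n+1} as 0 at the truncation).
a = x - np.append(x[1:], 0.0)
assert np.all(a >= 0)                   # nonnegative, since x is decreasing

# Entry i of sum_k a_k E_k is the tail sum  sum_{k >= i} a_k,
# which telescopes back to x_i.
recon = np.cumsum(a[::-1])[::-1]
assert np.allclose(recon, x)
```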

Note that $b^{k} = (e^{-ki})_i$ does not work: for example, the $E_k$ cannot be represented as a nonnegative linear combination of these sequences (any nonnegative combination with some nonzero coefficient is strictly positive in every entry, while $E_k$ has zero entries). In fact, it is quite easy to see that any 'basis' of this kind must consist of sequences which are eventually $0$.