There's plenty of literature about function approximation, both uniform and pointwise. Moreover, there are typically results on the rate at which approximations from a given basis converge to the target function, assuming some bound on its variability, e.g., its modulus of continuity.
I'd like references to similar theory for sequence approximation. I'm looking for references on known bases of sequences $\{a_{i, 1}\}_{i=1}^\infty$, $\{a_{i, 2}\}_{i=1}^\infty$, $\{a_{i, 3}\}_{i=1}^\infty$, $\dots$ such that any sequence $\{b_i\}_{i=1}^\infty$ in some family of 'well-behaved' sequences, e.g., square-summable sequences, can be well-approximated by a linear combination of the basis elements. I'd also like results on the rate of approximation as a function of some notion of 'variation' of the approximated sequence.
Can anybody point me to some well-known results, or even buzz-words I should look for?
P.S. I'm aware of the results on approximating on the half-line through Chebyshev rational functions and Legendre rational functions. So, in principle, I could approximate a sequence by just discretizing the corresponding rational functions at $x = 1, 2, \dots$. But it's not clear that this is the best we can do. For example, uniform convergence for functions does not imply any corresponding convergence notion for the resulting sequence. Moreover, the "variability" of a sequence is different from that of a continuous function sharing the same formula. I'm hoping that the discrete nature of sequences has led to some specialized results.
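To make the discretization idea concrete, here is a minimal numerical sketch (my own illustration, not a claim about optimality): the basis sequences are rational Chebyshev functions $TL_n(x) = T_n\!\left(\frac{x-L}{x+L}\right)$ sampled at the integers, fitted to a target sequence by least squares. The target $b_i = 1/(1+i)$, the map parameter $L$, and the truncation degree are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative parameters (assumptions, not prescribed by any reference):
L = 4.0    # map parameter of the half-line -> [-1, 1) transformation
N = 200    # number of sequence terms used in the fit
deg = 10   # number of basis sequences

i = np.arange(1, N + 1, dtype=float)
b = 1.0 / (1.0 + i)              # a 'well-behaved' target sequence

# Basis matrix: A[k, n] = T_n((i_k - L)/(i_k + L)) for n = 0..deg-1,
# using T_n(cos t) = cos(n t) to evaluate the Chebyshev polynomials.
s = (i - L) / (i + L)            # maps [0, inf) into [-1, 1)
A = np.cos(np.outer(np.arccos(s), np.arange(deg)))

# Least-squares fit of the sequence by the discretized basis.
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
approx = A @ coef
max_err = np.max(np.abs(approx - b))
print(max_err)
```

This converges quickly for this particular target (the corresponding function of $s$ is rational with poles off $[-1,1]$), but, as noted above, such a guarantee comes from the continuous theory and says nothing a priori about sequences whose "variation" differs from that of an interpolating function.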