Suppose that for all $k\in\mathbb{N}$ we have $$\sum_{i=-\infty}^{\infty}a_{ik}x_i=e_k,$$ where the $a_{ik}$ and $e_k$ are complex numbers. Let $A$ be the infinite matrix whose rows are the infinite vectors $$(\dots,\ a_{-2,k},\ a_{-1,k},\ a_{0,k},\ a_{1,k},\ a_{2,k},\ \dots),$$ and suppose that these vectors are linearly independent.
I need to express the numbers $x_i$ in terms of the numbers $e_k$, so I have to solve this infinite system of linear equations in infinitely many unknowns, or at least prove that a solution exists.
My plan is to find the inverse of $A$ and to prove that the rows of the inverse form convergent series.
My questions are:
Are there any general theorems that I could use in this situation to prove the existence of a solution?
Can one define the "inverse" of such an infinite matrix? Is the linear independence of the rows sufficient for the existence of an "inverse" which, when multiplied by the vector of the $e_k$, gives the desired $x_i$? For example, if I take the first $2n$ rows and the "middle" $2n$ columns, I obtain a square matrix whose inverse I can compute. If I calculate this inverse for every $n\in\mathbb{N}$, does the limit of this sequence of matrices yield such an inverse?
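To experiment with this truncation idea numerically, here is a minimal sketch. Everything in it is an assumption made purely for illustration, not part of the question: a toy kernel $a_{ik}$ indexed by $i,k\in\mathbb{Z}$ that is strictly diagonally dominant (so every finite truncation is invertible), and a decaying "true" solution $x_i$ from which the $e_k$ are generated. The sketch solves growing central truncations and checks whether the middle entries of the computed solution stabilise.

```python
import numpy as np

def a(i, k):
    # Assumed toy kernel: 1 on the diagonal, geometrically decaying off it.
    # Off-diagonal row sums are 0.3 * 2 * (0.5 + 0.25 + ...) = 0.6 < 1,
    # so every finite truncation is strictly diagonally dominant, hence invertible.
    return 1.0 if i == k else 0.3 * 0.5 ** abs(i - k)

def x_true(i):
    # Assumed decaying "true" solution used to manufacture the right-hand side.
    return 0.9 ** abs(i)

# A large truncation stands in for the infinite sum when computing e_k;
# the tails are exponentially small, so this is an accurate stand-in.
N = 60
idx = np.arange(-N, N + 1)
A_big = np.array([[a(i, k) for i in idx] for k in idx])
e_big = A_big @ np.array([x_true(i) for i in idx])
e = dict(zip(idx, e_big))

def solve_truncated(n):
    # Solve the central (2n+1) x (2n+1) truncation of the system.
    sub = np.arange(-n, n + 1)
    A_n = np.array([[a(i, k) for i in sub] for k in sub])
    rhs = np.array([e[k] for k in sub])
    return dict(zip(sub, np.linalg.solve(A_n, rhs)))

for n in (5, 10, 20):
    sol = solve_truncated(n)
    print(n, abs(sol[0] - x_true(0)))  # error at the central entry
```

In this diagonally dominant toy case the central entries do converge as $n$ grows, but that is a property of the assumed kernel; for a general $A$ the truncated inverses need not converge, which is exactly what the functional-analytic conditions below are for.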
This kind of question is most easily answered if you think of the "matrix" as an operator acting on a Hilbert space with a countable basis (say, a space of functions such as $H^1$). In that case, the entries of the vectors are the coefficients of elements of the Hilbert space expanded in the basis you chose, and the "matrix" entries simply describe how the operator acts on that basis.
Once you see this connection, you can apply the entire machinery of functional analysis that deals with the invertibility of operators.
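As one concrete illustration of that machinery (my own example, under an assumption the question does not guarantee): if the operator is a small bounded perturbation of the identity on $\ell^2(\mathbb{Z})$, say $A = I + K$ with operator norm $\|K\| < 1$, then the Neumann series $$A^{-1} = \sum_{n=0}^{\infty} (-1)^n K^n$$ converges in operator norm, so $Ax = e$ has the unique solution $x = A^{-1}e$ for every $e \in \ell^2(\mathbb{Z})$. Linear independence of the rows alone is not enough for this; one needs some quantitative control of this kind on the operator.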