Orthogonal function expansions and eigenvalues


Consider expanding some function $f: [-1,1] \rightarrow \mathbb{R}$ in terms of, say, Legendre polynomials: $$ f(x) = \sum_{n=0}^{N} a_n P_n(x) $$ where the truncation limit $N$ is some large number sufficient to represent $f(x)$ to some desired accuracy. This can be written as a vector dot product: $$ f(x) = {\bf p}(x)^T {\bf a} $$ where $$ \mathbf{p}(x) = \pmatrix{ P_0(x) \\ P_1(x) \\ \vdots \\ P_N(x)}, \qquad \mathbf{a} = \pmatrix{a_0 \\ a_1 \\ \vdots \\ a_N} $$ Now suppose I want to sample $f(x)$ at some discrete set of points $x \in X = \{x_0, x_1, \dots, x_M\}$, so that $$ \mathbf{f}(X) = \mathbf{P}(X) \mathbf{a} $$ where $$ \mathbf{f}(X) = \pmatrix{ f(x_0) \\ f(x_1) \\ \vdots \\ f(x_M)}, \qquad \mathbf{P}(X) = \pmatrix{ \mathbf{p}(x_0)^T \\ \mathbf{p}(x_1)^T \\ \vdots \\ \mathbf{p}(x_M)^T} $$ My questions are:

  1. If I choose $X$ to uniformly sample the interval $[-1,1]$ and let $M,N \rightarrow \infty$, do the eigenvalues of the matrix $\mathbf{P}$ converge to some values? Do there exist analytic expressions for these eigenvalues? What about eigenvectors? Note: since the matrix $\mathbf{P}$ is $(M+1) \times (N+1)$ and in general rectangular, it may make more sense to talk about eigenvalues of $\mathbf{P}^T \mathbf{P}$, or singular values of $\mathbf{P}$ itself.

  2. Is it true that in the limit $M,N \rightarrow \infty$, $\mathbf{P}$ becomes a linear operator acting on a Hilbert space?

  3. What if, instead of Legendre polynomials, I use other orthogonal functions, such as Laguerre or Chebyshev polynomials, spherical harmonics, or sines/cosines? Are there any known results about the corresponding linear operators for those basis sets, such as their eigenvalues/eigenvectors?

  4. Is there a branch of mathematics that studies these types of questions? If so, what is it, and can anyone recommend good references to learn more?
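For concreteness, the sampling matrix $\mathbf{P}(X)$ above is exactly what NumPy's Legendre "pseudo-Vandermonde" routine produces. A minimal sketch (the sizes $N = 10$, $M = 50$ are arbitrary choices for illustration):

```python
import numpy as np
from numpy.polynomial import legendre

# Illustrative sizes (arbitrary): N+1 basis polynomials, M+1 sample points.
N, M = 10, 50

# Uniform sample points on [-1, 1].
x = np.linspace(-1.0, 1.0, M + 1)

# P(X) has shape (M+1, N+1); column n holds P_n evaluated at the samples.
P = legendre.legvander(x, N)

# f(X) = P(X) a for a coefficient vector a; here f = P_2 = (3x^2 - 1)/2.
a = np.zeros(N + 1)
a[2] = 1.0
f = P @ a
assert np.allclose(f, (3 * x**2 - 1) / 2)
```

Row $i$ of `P` is the vector $\mathbf{p}(x_i)^T$ from the question, so `P @ a` reproduces $\mathbf{f}(X) = \mathbf{P}(X)\mathbf{a}$ directly.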
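As a numerical sanity check on question 1 (an experiment, not a proof): by the orthogonality relation $\int_{-1}^{1} P_m(x) P_n(x)\,dx = \frac{2}{2n+1}\delta_{mn}$, the scaled matrix $\Delta x\,\mathbf{P}^T \mathbf{P}$ is a Riemann-sum approximation of the Gram matrix, so one would expect it to approach $\mathrm{diag}\!\left(\frac{2}{2n+1}\right)$ as $M \rightarrow \infty$ with $N$ fixed, and hence the singular values of $\sqrt{\Delta x}\,\mathbf{P}$ to approach $\sqrt{2/(2n+1)}$:

```python
import numpy as np
from numpy.polynomial import legendre

N = 8  # arbitrary fixed truncation for the experiment
target = np.diag(2.0 / (2 * np.arange(N + 1) + 1))  # exact Gram matrix

for M in (100, 1000, 10000):
    x = np.linspace(-1.0, 1.0, M + 1)
    dx = 2.0 / M
    P = legendre.legvander(x, N)
    # Riemann-sum approximation of the L^2 inner products of the P_n.
    gram = dx * (P.T @ P)
    # Squared singular values of sqrt(dx) * P approximate the diagonal 2/(2n+1).
    sv = np.linalg.svd(np.sqrt(dx) * P, compute_uv=False)
    print(M, np.abs(gram - target).max(),
          np.abs(np.sort(sv**2) - np.sort(np.diag(target))).max())
```

Both reported deviations shrink as $M$ grows, consistent with the singular values of the (suitably normalized) matrix converging; this says nothing yet about the joint limit $M, N \rightarrow \infty$, which is what the question is really after.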