Eigenvalues and orthogonal function expansions


Suppose I have a function $f: [-1,1] \rightarrow \mathbb{R}$ expanded in terms of Legendre polynomials, $$ f(x) = \sum_{n=0}^{\infty} a_n P_n(x) $$ and I evaluate this function at a large number of points $x_i$ in $[-1,1]$, which I store in a vector $\mathbf{f}$ (so $\mathbf{f}_i = f(x_i)$). Then I could define a matrix equation $$ \mathbf{f} = \mathbf{P} \mathbf{a} $$ with $\mathbf{P}_{ij} = P_j(x_i)$ and $\mathbf{a}_j = a_j$. I believe that $\mathbf{a}$ would then be a vector in an infinite-dimensional Hilbert space and $\mathbf{P}$ would be an operator on this space, which would eat the coefficient vector $\mathbf{a}$ and produce the samples of $f$ on $[-1,1]$.
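For concreteness, here is a small sketch of this setup with a truncated expansion (the truncation order $N$, the sample count $M$, and the random coefficients are my choices for illustration), using NumPy's Legendre Vandermonde matrix for $\mathbf{P}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Truncated expansion: N coefficients a_n (arbitrary illustrative values),
# evaluated at M sample points x_i in [-1, 1].
N, M = 6, 100
a = rng.standard_normal(N)
x = rng.uniform(-1.0, 1.0, size=M)

# P_ij = P_j(x_i): columns are Legendre polynomials evaluated at x.
P = np.polynomial.legendre.legvander(x, N - 1)  # shape (M, N)

# f = P a should agree with evaluating the Legendre series directly.
f = P @ a
f_direct = np.polynomial.legendre.legval(x, a)
assert np.allclose(f, f_direct)
```

The check against `legval` confirms that the matrix-vector product really does reproduce the truncated series at the sample points.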

My question is: can the eigenvalues / eigenvectors of the operator $\mathbf{P}^T \mathbf{P}$ be determined analytically? Equivalently, can we find the singular values / singular vectors of $\mathbf{P}$ analytically?

This problem could be investigated numerically: pick $N$ random coefficients $a_n$ and $M$ random points $x_i$ on $[-1,1]$, compute the $M$-vector $\mathbf{f}$ and the $M$-by-$N$ matrix $\mathbf{P}$, and calculate the singular values of $\mathbf{P}$ numerically. If $M$ and $N$ are chosen to be sufficiently large, I imagine these singular values / eigenvalues would converge to something. I am asking whether an analytic expression can be found for what they converge to as $M,N \rightarrow \infty$.
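The experiment described above can be sketched as follows (the specific values of $M$ and $N$ and the uniform sampling of the $x_i$ are my assumptions). For uniformly sampled points, $\frac{1}{M}\mathbf{P}^T\mathbf{P}$ is a Monte Carlo estimate of the Gram matrix $\frac{1}{2}\int_{-1}^{1} P_j(x)P_k(x)\,dx = \delta_{jk}/(2j+1)$ by Legendre orthogonality, which suggests what the scaled squared singular values should approach:

```python
import numpy as np

rng = np.random.default_rng(0)

M, N = 20000, 10  # many sample points, modest truncation order

# Sample points uniformly on [-1, 1] and build P_ij = P_j(x_i).
x = rng.uniform(-1.0, 1.0, size=M)
P = np.polynomial.legendre.legvander(x, N - 1)  # shape (M, N)

# Singular values of P; their squares are the eigenvalues of P^T P.
s = np.linalg.svd(P, compute_uv=False)

# For uniform sampling, (1/M) P^T P estimates the Gram matrix
# (1/2) * integral of P_j P_k over [-1, 1] = delta_jk / (2j + 1),
# so s^2 / M should approach 1, 1/3, 1/5, ..., 1/(2N - 1).
print(np.sort(s**2 / M)[::-1])
```

Note that the singular values depend only on $\mathbf{P}$ (i.e., on the sample points), not on the random coefficients $a_n$, so the coefficient draw can be skipped in this experiment.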