In my linear algebra course I stumbled upon the following observations.
We have some function $f: \Bbb{R} \to \Bbb{R}$, $f = f(x)$.
Whether or not $f(x)$ is built from elementary functions, we can expand it (at least formally) in any basis we choose.
If we express $f(x)$ in the following basis:
$$\{1, x, x^2, x^3, \ldots\}$$
we get the coefficients of the Taylor series of $f(x)$ about $x = 0$.
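For instance, the coefficient of $x^n$ is $f^{(n)}(0)/n!$. A minimal sketch for the concrete case $f(x) = e^x$ (my choice of example, where the coefficients are simply $1/n!$):

```python
import math

# Coefficients of f(x) = e^x in the basis {1, x, x^2, ...} about 0:
# the n-th coefficient is f^(n)(0)/n! = 1/n!.
def taylor_coeffs_exp(n_terms):
    return [1.0 / math.factorial(n) for n in range(n_terms)]

def eval_series(coeffs, x):
    # Evaluate sum_n c_n x^n via Horner's method.
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

coeffs = taylor_coeffs_exp(15)
print(eval_series(coeffs, 1.0))  # close to e ≈ 2.71828...
```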
If we express $f(x)$ in the following basis:
$$\{1, \sin x, \cos x, \sin {2x}, \cos {2x}, \ldots\}$$
we get the coefficients of the Fourier series of $f(x)$ (not the Fourier transform, which is the continuous-frequency analogue).
I have two questions.
**1.**
If we express $f(x)$ in the following basis:
$$\{1,\ x,\ \tfrac12(3x^2 - 1),\ \tfrac12(5x^3 - 3x),\ \ldots\} \quad \text{(the Legendre polynomials)}$$
Do the coefficients tell us anything useful?
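As one concrete experiment (using NumPy's Legendre module, and taking $f(x) = x^3$ as an example, whose exact expansion is $x^3 = \tfrac35 P_1 + \tfrac25 P_3$):

```python
import numpy as np

# Least-squares fit of f(x) = x^3 in the Legendre basis on [-1, 1].
# Exactly: x^3 = (3/5) P_1(x) + (2/5) P_3(x), so the coefficients
# should come out as [0, 0.6, 0, 0.4].
xs = np.linspace(-1, 1, 1001)
coeffs = np.polynomial.legendre.legfit(xs, xs**3, deg=3)
print(np.round(coeffs, 4))
```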
**2.**
Other than the monomial (Taylor) basis and the trigonometric (Fourier) basis, have there been other famous function bases in mathematics? What were they used for?
I tried looking online and on this site but couldn't find any good information.
As a remark, these are not algebraic (Hamel) bases, since the expansions are infinite sums. But you can talk about orthonormal bases of a Hilbert space, as you alluded to.
For example, different weighted inner products give different $L^2$ spaces and families of "orthogonal polynomials" (such as the Legendre polynomials). One important application is Gaussian quadrature, which computes accurate numerical approximations of integrals.
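A small sketch of that use, via NumPy's `leggauss`: the nodes of an $n$-point Gauss–Legendre rule are the roots of $P_n$, and the rule integrates polynomials of degree up to $2n - 1$ exactly.

```python
import numpy as np

# 5-point Gauss-Legendre rule: nodes are the roots of P_5, and the rule
# integrates polynomials of degree <= 9 exactly.  Applied to e^x on [-1, 1]:
nodes, weights = np.polynomial.legendre.leggauss(5)
approx = np.sum(weights * np.exp(nodes))
exact = np.e - 1 / np.e  # integral of e^x from -1 to 1
print(approx, exact)
```

With only five evaluations of the integrand, the two values already agree to roughly eight decimal places.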