During a discussion of linear Hermitian operators, my professor claimed that if a linear operator $M$ is Hermitian under a certain set of conditions, then generally any function satisfying those conditions can be expressed as an infinite sum of the operator's eigenfunctions.
However, he did not prove it, since this is a "math for physicists" course and quite informal. I am interested in the proof.
We proved a similar theorem in a linear algebra course (the eigenvectors of a Hermitian $(n\times n)$ matrix span the space), but that proof used induction on the dimension of the space, and here the dimension is infinite.
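The finite-dimensional statement can be checked numerically. Here is a minimal sketch using NumPy, with an arbitrary randomly generated Hermitian matrix as the example:

```python
import numpy as np

# Finite-dimensional spectral theorem: a Hermitian matrix M has an
# orthonormal basis of eigenvectors, so M = V diag(w) V^H with w real.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
M = (A + A.conj().T) / 2              # symmetrize to make M Hermitian

w, V = np.linalg.eigh(M)              # eigh is specialized to Hermitian matrices

# The columns of V are orthonormal (V^H V = I), so any vector x
# expands in the eigenbasis: x = sum_k <v_k, x> v_k.
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
coeffs = V.conj().T @ x               # components <v_k, x>
x_rebuilt = V @ coeffs                # the eigenvector expansion of x

print(np.allclose(x, x_rebuilt))                        # True
print(np.allclose(M, V @ np.diag(w) @ V.conj().T))      # True
```

The `allclose` checks confirm both the eigenbasis expansion of an arbitrary vector and the diagonalization $M = V\,\mathrm{diag}(w)\,V^\dagger$.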
The correct rigorous version of this is the Spectral Theorem.
I don't know which "certain set of conditions" the professor had in mind. In general, a self-adjoint operator on an infinite-dimensional Hilbert space may have continuous spectrum, in which case there are no genuine eigenfunctions at all (though a physicist might speak of "generalized eigenfunctions" that are not members of the Hilbert space, and the "sum" then becomes an integral).
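When the spectrum *is* discrete, the expansion the professor described can be seen concretely. A standard example (my choice, not necessarily the professor's) is $M = -d^2/dx^2$ on $[0,\pi]$ with Dirichlet boundary conditions: its eigenfunctions are $\sin(nx)$, and a function satisfying the same boundary conditions expands in them as a Fourier sine series. A numerical sketch:

```python
import numpy as np

# Eigenfunctions of -d^2/dx^2 on [0, pi] with Dirichlet boundary
# conditions are sin(n x), n = 1, 2, ...  Expand f(x) = x(pi - x),
# which satisfies f(0) = f(pi) = 0, in this eigenbasis.
x = np.linspace(0, np.pi, 2001)
dx = x[1] - x[0]
f = x * (np.pi - x)

def sine_coeff(n):
    # <sin(n x), f> / <sin(n x), sin(n x)>, with <sin(n x), sin(n x)> = pi/2;
    # the inner product is approximated by a Riemann sum (endpoints vanish).
    phi = np.sin(n * x)
    return np.sum(phi * f) * dx / (np.pi / 2)

# Partial sum of the eigenfunction expansion
approx = sum(sine_coeff(n) * np.sin(n * x) for n in range(1, 50))

err = np.max(np.abs(f - approx))
print(err)   # small: the partial sums converge to f
```

The maximum error shrinks as more eigenfunctions are included, which is exactly the infinite-sum claim in this discrete-spectrum case.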