I have some more general questions, of a more intuitive nature, about the operators used in functional analysis.
- How did anyone ever come up with the concept of adjoint operators? Does it arise somewhat naturally when solving "practical" problems (e.g. evaluating integrals)?
- Basically the same question for compact operators. Is the definition intuitive in the sense that it naturally leads to the beautiful results we get from compact operators? I am talking about the definition that for every bounded sequence $x_{n}$, the sequence $Tx_{n}$ contains a convergent subsequence (why not, for example, impose some other condition on $Tx_{n}$?). There must be some explanation for why it is defined like that.
I would appreciate some references where I can read more about this.
Thanks!
Integration by parts led to the adjoint. For example, if $Lf=af''+bf'+cf$ is a linear differential operator, then you can integrate by parts to obtain the Lagrange adjoint $L^{\dagger}$:
\begin{align}
\int (Lf)g\,dx &= \int (af''g+bf'g+cfg)\,dx \\
&= af'g+bfg+\int \left(-f'(ag)'-f(bg)'+cfg\right)dx \\
&= af'g+bfg-f(ag)'+\int \left(f(ag)''-f(bg)'+cfg\right)dx \\
&= af'g+bfg-f(ag)'+\int f\,(L^{\dagger}g)\,dx,
\end{align}
where $L^{\dagger}g = (ag)''-(bg)'+cg$. The above may be written as the Lagrange identity
$$ (Lf)g-f(L^{\dagger}g)=\frac{d}{dx}\bigl(af'g+bfg-f(ag)'\bigr), $$
which led to reduction-of-order algorithms for solving ODEs. Also, if you integrate over an interval and the boundary terms vanish, then
$$ \int_a^b(Lf)g\,dx = \int_a^{b}f(L^{\dagger}g)\,dx. $$
This was used in studying Sturm-Liouville eigenvalue problems, which are associated with second-order self-adjoint operators. The infinite-dimensional techniques were then borrowed to study symmetric and Hermitian matrices.
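The Lagrange identity above is easy to check symbolically. Here is a small sketch using `sympy` (my own illustration, not part of the classical development), with $a,b,c,f,g$ as arbitrary functions of $x$:

```python
import sympy as sp

x = sp.symbols('x')
a, b, c, f, g = (sp.Function(s)(x) for s in 'abcfg')

# L f = a f'' + b f' + c f, and the Lagrange adjoint L†g = (ag)'' - (bg)' + cg
Lf = a * f.diff(x, 2) + b * f.diff(x) + c * f
Ldag_g = (a * g).diff(x, 2) - (b * g).diff(x) + c * g

# Lagrange identity: (Lf)g - f(L†g) = d/dx ( a f' g + b f g - f (ag)' )
lhs = Lf * g - f * Ldag_g
rhs = sp.diff(a * f.diff(x) * g + b * f * g - f * (a * g).diff(x), x)

print(sp.simplify(lhs - rhs))  # 0
```

All the mixed derivative terms cancel exactly, which is what makes the identity useful: the difference $(Lf)g - f(L^{\dagger}g)$ is an exact derivative, so it integrates to pure boundary terms.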
Compactness of operators arose from the study of integral operators, where it was common for operators on finite domains to map uniformly bounded sets of functions to equicontinuous ones; such techniques, especially those of Fredholm, led to proofs of the existence of eigenvalues and eigenfunctions. Hilbert and his students, most notably John von Neumann, adapted these techniques to define and study abstract inner product spaces, over a century after the development of the core ideas. These infinite-dimensional techniques then filtered down to finite-dimensional spaces, where they were borrowed to study adjoint, symmetric, and Hermitian matrices. Fourier expansions were adapted to define and study the orthogonal eigenvector expansions associated with symmetric and Hermitian matrices; by then it was known that the general theory guaranteed the existence of such expansions.
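As a rough numerical sketch of why integral operators tend to be compact (my own illustration; the kernel $k(x,y)=\min(x,y)$ is just an assumed example, not from the historical discussion above): discretizing such an operator produces a matrix whose singular values decay rapidly, so the operator is well approximated by finite-rank operators — the matrix analogue of "bounded sequences go to sequences with convergent subsequences."

```python
import numpy as np

# Midpoint-rule discretization of (Tf)(x) = \int_0^1 k(x, y) f(y) dy
# with the illustrative kernel k(x, y) = min(x, y).
n = 200
x = (np.arange(n) + 0.5) / n       # midpoints of a uniform grid on [0, 1]
K = np.minimum.outer(x, x) / n     # quadrature weight 1/n folded in

# Compactness shows up as rapidly decaying singular values:
# the discretized operator is nearly finite-rank.
s = np.linalg.svd(K, compute_uv=False)
print(s[:5] / s[0])                # the spectrum falls off quickly
```

For this kernel the singular values decay roughly like $1/k^2$, so truncating to a handful of singular vectors already reproduces the operator to high accuracy.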