Approximate representation of functions using Dirac delta


I have seen some physics books giving an approximate representation, $f_{\text{appx}}(x)$, of a function $f(x)$ where,

$$f_{\text{appx}}(x) =\sum_{n=1}^{N} \alpha_n \delta(x-x_n)$$

and

$$\alpha_n = \int_{-\infty}^{\infty} f(x)\,\delta(x-x_n)\, dx = f(x_n)$$

It appears to me that the Dirac deltas, $\delta(x-x_n)$, are being used as basis functions. I have a few questions regarding this.

  1. Wouldn't the summation $\sum_{n=1}^{N} \alpha_n \delta(x-x_n)$ become $+\infty$ when $x=x_n$?
  2. The approximated function $f_{\text{appx}}(x)$ doesn't seem to be square integrable, and the norm of the error, i.e. $\left\| f(x)-f_{\text{appx}}(x) \right\|$, would be infinite. In that case, how is the approximation valid?
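To make question 1 concrete, here is a minimal numeric sketch (with an assumed example $f(x)=e^{-x^2}$ and test function $\varphi(x)=\cos x$, neither of which appears in the books quoted). The delta comb is never evaluated pointwise; only its pairing with a test function is computed. Note one assumption of the sketch: for samples on a grid, the weights must include the spacing, $\alpha_n = f(x_n)\,\Delta x$, for the sum to approximate $\int f\varphi\,dx$.

```python
import numpy as np

# Assumed example: f(x) = exp(-x^2) sampled on a grid, paired with the
# test function phi(x) = cos(x).  The delta comb is never evaluated
# pointwise; only the pairing
#     (f_appx, phi) = sum_n alpha_n * phi(x_n)
# is computed, with alpha_n = f(x_n) * dx so that the sum approximates
# the integral of f(x) * phi(x).
f = lambda x: np.exp(-x**2)
phi = lambda x: np.cos(x)

x_n = np.linspace(-5.0, 5.0, 201)   # sample points x_n
dx = x_n[1] - x_n[0]                # grid spacing

pairing = np.sum(f(x_n) * dx * phi(x_n))

# Known closed form: int exp(-x^2) cos(x) dx = sqrt(pi) * exp(-1/4)
exact = np.sqrt(np.pi) * np.exp(-0.25)
print(pairing, exact)   # the two agree to many decimal places
```

No $+\infty$ ever appears, because the delta comb only acts inside the integral against $\varphi$.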



In looking at Griffiths' book Introduction to Quantum Mechanics, 1st Ed., p. 52-53, he talks about the fact that scattering states aren't normalizable; perhaps your physics book is talking about scattering states?

Perhaps more relevantly, on pages 101 and 102 of the same, Griffiths uses the delta functions as a basis because they are the eigenfunctions of the position operator, and they're complete. He has the following footnotes on page 102 that might clarify for you what's going on:

21 We are engaged here in a dangerous stretching of the rules, pioneered by Dirac (who had a kind of inspired confidence that he could get away with it) and disparaged by von Neumann (who was more sensitive to mathematical niceties), in their rival classics (P. A. M. Dirac, The Principles of Quantum Mechanics, first published in 1930, 4th Ed., Oxford (Clarendon Press) 1958, and J. von Neumann, The Mathematical Foundations of Quantum Mechanics, first published in 1932, revised by Princeton Univ. Press, 1955). Dirac notation invites us to apply the language and methods of linear algebra to functions that lie in the "almost normalizable" suburbs of Hilbert space. It turns out to be powerful and effective beyond any reasonable expectation.

22 That's right: We're going to use, as bases, sets of functions none of which is actually in the space! They may not be normalizable, but they are complete, and that's all we need.


It is easy to imagine that any distribution $f$ supported on a discrete set of points $\{x_n\}$ can be represented in the form $$ f(x) = \sum_{n = 1}^\infty f(x_n) \delta(x - x_n), \qquad \bigcup_{n = 1}^\infty \{x_n\} = {\rm supp}(f). $$ Indeed, if $f$ is represented in that form, then for any compactly supported test function $\varphi$, we have $$ (f, \varphi) = \int_{-\infty}^\infty f(x) \varphi(x)\, dx = \sum_{n = 1}^\infty f(x_n) \varphi(x_n). $$ In this context, the function $$ f_{\rm appx}(x) = \sum_{n = 1}^N f(x_n) \delta(x - x_n) $$ is a partial sum of $f$, so $f$ can be approximated by $f_{\rm appx}$.
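As a numeric sketch of this pairing, take assumed data that do not come from the answer itself: support at the integers $x_n = n$ with weights $f(x_n) = 1/n^2$, and a hypothetical test function $\varphi(x) = 1/(1+x)$ (the compact-support requirement is ignored for the sketch). The partial sums $\sum_{n=1}^N f(x_n)\varphi(x_n)$ settle to a limit, which is the sense in which $f_{\rm appx}$ approximates $f$:

```python
# Assumed data (not from the answer): a distribution supported on the
# integers x_n = n with weights f(x_n) = 1/n^2, paired against the
# hypothetical test function phi(x) = 1/(1 + x).
def weight(n):
    """Weight f(x_n) carried by the delta at x_n = n."""
    return 1.0 / n**2

def phi(x):
    """Assumed test function (compact support ignored in this sketch)."""
    return 1.0 / (1.0 + x)

def pairing(N):
    """(f_appx, phi) = sum_{n=1}^{N} f(x_n) * phi(x_n)."""
    return sum(weight(n) * phi(n) for n in range(1, N + 1))

# The partial sums settle down as N grows -- this is the distributional
# sense in which f_appx approximates f.
for N in (10, 100, 10000):
    print(N, pairing(N))
```

The sequence of pairings is increasing and convergent here, so increasing $N$ buys a better approximation of $(f, \varphi)$.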

Regarding your questions: from a mathematical point of view, the answer to the first one is no. You have to treat $f_{\rm appx}$ in the sense of distributions, i.e., for a test function $\varphi$ as above, $$ \int_{-\infty}^\infty f_{\rm appx}(x) \varphi(x)\, dx = \sum_{n = 1}^N \alpha_n \varphi(x_n) = \sum_{n = 1}^N f(x_n) \varphi(x_n), $$ which is a partial sum of $(f, \varphi)$. The deltas are never evaluated pointwise, so no value $+\infty$ ever arises.

Since you are dealing with distributions and not proper functions, instead of $\| f - f_{\rm appx} \|$ you should consider the error of the pairings, $$ \left| (f, \varphi) - (f_{\rm appx}, \varphi) \right| = \left| \int_{-\infty}^\infty f(x) \varphi(x)\, dx - \sum_{n = 1}^N f(x_n) \varphi(x_n) \right| = \left| \sum_{n = N+1}^\infty f(x_n) \varphi(x_n) \right|. $$ As you can see, this error can be quite large if $N$ is small, and it tends to $0$ as $N \to \infty$ whenever the series converges.
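A toy computation of this tail error, under assumed data not taken from the answer (weights $f(x_n) = 2^{-n}$ at $x_n = n$, and $\varphi$ taken to be $1$ at every $x_n$ for simplicity, so the full pairing is a geometric series with sum $1$), shows it shrinking as $N$ grows:

```python
# Assumed data (not from the answer): weights f(x_n) = 2^{-n} at x_n = n,
# with phi equal to 1 at every x_n for simplicity, so that
# (f, phi) = sum_{n>=1} 2^{-n} = 1.
def tail_error(N):
    """|(f, phi) - (f_appx, phi)| = sum_{n=N+1}^inf 2^{-n} = 2^{-N}."""
    full = 1.0                                      # sum_{n>=1} 2^{-n}
    partial = sum(2.0**-n for n in range(1, N + 1))  # (f_appx, phi)
    return abs(full - partial)

# The error is exactly 2^{-N}, shrinking geometrically with N:
print([tail_error(N) for N in (2, 5, 10)])   # -> [0.25, 0.03125, 0.0009765625]
```

Small $N$ leaves a large tail; the error only becomes small once $N$ covers most of the mass of the series.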

I recommend the book by Teodorescu, Kecs and Toma, "Distribution Theory: With Applications in Engineering and Physics", where the use of distributions in physics and mechanics is mathematically justified.