In the literature of sequential Monte Carlo, the probability density function (PDF) of a random variable $X$ is often stated as equal to the expectation of the Dirac delta "function" shifted by $X$:
$$f_X(x)=\mathbf E[\delta(x-X)]$$
Here is a blog post explaining this.
Suppose that $X$ has a PDF. Then, the expectation of any function $g$ of $X$ is
$$\mathbf E[g(X)]=\int_{-\infty}^\infty g(x)f_X(x)dx$$
Let $g(x)=\delta(y-x)$ for some $y\in\mathbf R$. Then,
$$\mathbf E[\delta(y-X)]=\int_{-\infty}^\infty\delta(y-x)f_X(x)dx=f_X(y)$$
Since $y$ is arbitrary, this holds for all $y\in\mathbf R$.
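At least numerically, the identity is easy to check if the Dirac delta is replaced by a narrow mollifier, e.g. a Gaussian of small width $\varepsilon$. A minimal sketch (my own illustration, not from the post): take $X\sim N(0,1)$, approximate $\mathbf E[\delta_\varepsilon(y-X)]$ by a sample average, and compare with $f_X(y)$. The names `delta_eps` and `approx_density` are made up for this example.

```python
import math
import random

def delta_eps(t, eps=0.05):
    """Gaussian mollifier: integrates to 1 and concentrates as eps -> 0."""
    return math.exp(-t * t / (2 * eps * eps)) / (eps * math.sqrt(2 * math.pi))

def approx_density(y, n=200_000, eps=0.05, seed=0):
    """Monte Carlo estimate of E[delta_eps(y - X)] for X ~ N(0, 1)."""
    rng = random.Random(seed)
    total = sum(delta_eps(y - rng.gauss(0.0, 1.0)) for _ in range(n))
    return total / n

y = 0.5
exact = math.exp(-y * y / 2) / math.sqrt(2 * math.pi)  # standard normal PDF
print(approx_density(y), exact)  # the two values should be close
```

As $\varepsilon\to 0$ the bias of the mollified estimator vanishes, but its variance blows up, which is why a rigorous statement needs the distributional viewpoint rather than a pointwise one.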
We know that a real-valued function of a random variable is again a random variable, so we may speak of its expectation. However, $\delta(x-X)$ is really a map from a random variable $X$ to a "function": $X\mapsto x\mapsto\delta(x-X)$. Strictly speaking, $\delta$ is not even a function; it is a distribution (a generalized function). How can we speak of the expectation of something that is not a random variable?
The empirical PDF of a sample $x_1,x_2,\ldots,x_n$ is defined (by "replacing the expectation with the average") as:
$$\frac1n\sum_{i=1}^n\delta(x-x_i)$$
Is this really a PDF? In my opinion, if such "functions" were accepted as PDFs, then every random variable would have a "PDF" — for instance, a fair coin taking values $0$ and $1$ would get the "density" $\frac12\delta(x)+\frac12\delta(x-1)$, even though it has no density with respect to Lebesgue measure.
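One way to turn the empirical "PDF" into an actual PDF is to convolve it with a smooth kernel, which yields a kernel density estimate: nonnegative and integrating to one. A minimal sketch (my own illustration; the function name `kde` and the bandwidth `h` are arbitrary choices):

```python
import math
import random

def kde(x, samples, h=0.3):
    """Gaussian-kernel density estimate at x: the sum of deltas
    (1/n) sum_i delta(x - x_i), smoothed by a kernel of width h."""
    n = len(samples)
    norm = h * math.sqrt(2 * math.pi)
    return sum(math.exp(-((x - xi) / h) ** 2 / 2) / norm for xi in samples) / n

rng = random.Random(1)
samples = [rng.gauss(0.0, 1.0) for _ in range(2000)]

# Riemann-sum check that the smoothed estimate integrates to about 1.
step = 0.02
grid = [i * step for i in range(-300, 301)]
total = sum(kde(x, samples) * step for x in grid)
print(total)  # approximately 1
```

The empirical sum of deltas itself is only a (discrete) probability measure; it becomes a genuine density only after this kind of smoothing.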