I am looking for a metric that would evaluate the distance between a sample $S$ and a density function $D$.
Building a sample from a known distribution can be done using Monte Carlo sampling; however, the inverse operation is much harder. I am studying different methods, and the impact of errors occurring in those algorithms, so I need a distance between these two objects.
Maximum likelihood does not meet my needs. To show why, take the example of a Gaussian distribution centered at $0$. Maximum likelihood would be achieved by placing all my points at $0$; any point elsewhere would correspond to a lower probability and therefore reduce the total likelihood.
For me, minimum distance should be achieved if, for every interval $I'$ of my domain $I$, I have something like $$\int_{I'} D(x) \,\mathrm dx = \frac{|\{x\in S \mid x\in I'\}|}{|S|}$$ That means that however low the density is on an interval, if the interval is large enough (or if my sample contains enough points), one or more points are expected to be in it.
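The property above can be checked numerically. Below is a minimal sketch, assuming a standard normal target density and an interval $I' = [0.5, 1.5)$ chosen purely for illustration; the names `mass` and `frac` are my own:

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(x):
    """CDF of the standard normal, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

rng = np.random.default_rng(1)
sample = rng.normal(size=100_000)  # S, drawn from the target itself

a, b = 0.5, 1.5  # an example interval I'

# Left-hand side: integral of D over I', computed from the CDF
mass = normal_cdf(b) - normal_cdf(a)

# Right-hand side: proportion of sample points falling in I'
frac = np.mean((sample >= a) & (sample < b))
```

For a sample actually drawn from $D$, `mass` and `frac` agree up to sampling noise of order $1/\sqrt{|S|}$; a distance metric should measure how badly this agreement fails across all intervals.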
Is there a metric with the property I'm looking for?
How about taking the Kullback-Leibler divergence between the empirical distribution (of the sample), and the target distribution?