Probability - Characterizing the goodness of a moment-matching method.


I have a question about how to characterize the goodness of approximating a distribution by matching its moments. Suppose I have a probability density function $p(x)$ (e.g., a normal density), and I want to compute $E[f(X)]$; but instead of using $p(x)$ directly, I use another distribution $q(x)$ (say, a binomial) whose first two moments match those of $p(x)$. I am interested in the difference (intuitively, the "absolute error"):

$$ \int f(x) \left[p(x) - q(x) \right] dx. $$

If the approximation works well, this quantity should be small compared to $\int f(x)\,p(x)\, dx$, which is the "true" answer we are after.

What is the best way to do this? One way I have tried to attack the problem is via characteristic functions/Fourier transforms: since the first two moments match, the first few terms of the expansions of the characteristic functions cancel. However, the inverse transform is problematic, because the remaining terms are polynomials in the frequency variable, and their inverse transforms diverge as ordinary integrals.
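For what it is worth, here is the formal expansion I have in mind (purely heuristic: it assumes $f$ is analytic, both distributions have all moments finite, and the term-by-term integration is justified). Taylor-expanding $f$ around the shared mean $\mu$ and integrating against $p - q$ gives

$$ \int f(x)\,[p(x) - q(x)]\, dx \;=\; \sum_{k=3}^{\infty} \frac{f^{(k)}(\mu)}{k!}\left(\mu_k^{(p)} - \mu_k^{(q)}\right), $$

where $\mu_k^{(p)} = \int (x-\mu)^k\, p(x)\, dx$ is the $k$-th central moment. The $k = 0, 1, 2$ terms vanish exactly because the zeroth, first, and second moments are matched, so the leading error term is governed by the difference of third central moments.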

I feel that there must be a way to attack this problem in general, without specifying exactly what $p(x)$ and $q(x)$ are (all we know is that their first two moments match, so intuitively $p$ and $q$ should look similar in some sense).
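As a concrete sanity check (my own sketch, not part of the question), here is how one can measure this error numerically for the normal/binomial pairing mentioned above. The moment-matching equations $np = \mu$ and $np(1-p) = \sigma^2$ with $\mu = 10$, $\sigma^2 = 5$ give $p = 0.5$, $n = 20$ (conveniently an integer), and we compare $E_p[f(X)]$ with $E_q[f(X)]$ for a few test functions $f$:

```python
import math

# Match a binomial q = Bin(n, pr) to the first two moments of a normal
# p = N(mu, var), then measure the error in E[f(X)] for several f.
mu, var = 10.0, 5.0
pr = 1.0 - var / mu        # binomial success probability: np(1-p) = var
n = round(mu / pr)         # number of trials: np = mu (an integer here)

def binom_expect(f, n, pr):
    """Exact E[f(K)] for K ~ Binomial(n, pr), summed over the support."""
    return sum(math.comb(n, k) * pr**k * (1.0 - pr)**(n - k) * f(k)
               for k in range(n + 1))

def normal_expect(f, mu, var, m=20001, half_width=10.0):
    """E[f(X)] for X ~ N(mu, var) via trapezoidal quadrature on mu +/- 10 sd."""
    sd = math.sqrt(var)
    lo = mu - half_width * sd
    h = 2.0 * half_width * sd / (m - 1)
    norm = 1.0 / (sd * math.sqrt(2.0 * math.pi))
    total = 0.0
    for i in range(m):
        x = lo + i * h
        w = 0.5 if i in (0, m - 1) else 1.0   # trapezoid endpoint weights
        total += w * f(x) * norm * math.exp(-(x - mu) ** 2 / (2.0 * var))
    return total * h

if __name__ == "__main__":
    for name, f in [("x^2", lambda x: x**2),
                    ("x^4", lambda x: x**4),
                    ("exp(x)", lambda x: math.exp(x))]:
        ep = normal_expect(f, mu, var)   # "true" answer under p
        eq = binom_expect(f, n, pr)      # moment-matched approximation
        print(f"f = {name:6s}  E_p = {ep:12.2f}  E_q = {eq:12.2f}  "
              f"rel. err = {abs(ep - eq) / abs(ep):.2e}")
```

By construction the error is essentially zero for $f(x) = x^2$, small for $f(x) = x^4$, and noticeably larger for $f(x) = e^x$, which weights the tails and so feels the higher-moment mismatch; this matches the intuition that the quality of the approximation depends on how much $f$ depends on moments beyond the second.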