Is there a meaningful way to approximate a discrete random variable?


Is there a meaningful way to find a continuous approximation of a discrete random variable?

Thoughts for the $L^2$ case

If $X \in L^2$, then we may want to consider the subspace $V = C^1 \cap L^2$ and try to project the discrete density $P^X$ onto $V$. If $V$ is closed, then this projection should exist and be unique. Does this make sense? Is there a way to actually carry out the computation?

Anyhow

I am specifically interested in the random variable $X$ with distribution $P(X = 2^k) = \frac{1}{2^{k+1}}$, which is not in $L^1$ and therefore not in $L^2$. The naive way to make this continuous is to set $k = \log_2 x$, which gives $P(X = x) = \frac{1}{2x}$. Differentiating yields the "density" $f(x) = -\frac{1}{2x^2}$. This is of course not a density, since it is not positive; moreover, it is not even integrable. It does, however, resemble the Cauchy density. Is there a way to make this argument precise and to make sense of all this?
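As a quick numerical sanity check on the claims above (not part of the original question), the following sketch verifies that the probabilities $P(X = 2^k) = \frac{1}{2^{k+1}}$ sum to $1$ while the partial sums of $E[X]$ grow without bound, confirming $X \notin L^1$:

```python
# Sanity check for the distribution P(X = 2^k) = 1/2^(k+1), k >= 0.
# The probabilities sum to 1, but every atom contributes
# 2^k * 2^-(k+1) = 1/2 to the mean, so E[X] diverges.

def partial_mass(n):
    """Total probability of the atoms 2^0, ..., 2^n."""
    return sum(2.0 ** -(k + 1) for k in range(n + 1))

def partial_mean(n):
    """Contribution of the atoms 2^0, ..., 2^n to E[X]."""
    return sum(2.0 ** k * 2.0 ** -(k + 1) for k in range(n + 1))

print(partial_mass(50))   # -> 1.0 (up to 2^-51): a valid distribution
print(partial_mean(50))   # -> 25.5: grows like n/2, so E[X] is infinite
```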

I'm interested in any literature on this topic if it exists; an explanation of why this never makes sense would also be nice :)

Thank you!


Best answer

Yes, there is. It is known as the Wiener–Askey polynomial chaos.

In short, a random variable with finite second moment can be approximated by a random variable with a distribution of our choosing, provided that it is also of finite second moment.

The essence of the technique is to decompose the random variable into an infinite sum of orthogonal polynomials of our chosen random variable. Several classical polynomial families are orthogonal with respect to the probability measures of well-known distributions; e.g., the Hermite polynomials are orthogonal (up to a scaling factor) with respect to the Gaussian distribution. We can exploit this to compute a set of deterministic coefficients for a truncated approximation.
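A minimal sketch of this idea, with an illustrative target chosen here (not taken from the answer): expand $X = \Phi(\xi) \sim \mathrm{Uniform}(0,1)$, where $\xi \sim N(0,1)$, in the probabilists' Hermite polynomials $He_k$, which satisfy $E[He_j(\xi)He_k(\xi)] = k!\,\delta_{jk}$, so the coefficients are $c_k = E[X\,He_k(\xi)]/k!$:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi, erf

# Hermite chaos expansion of X = Phi(xi), xi ~ N(0,1), as an illustration.
# c_k = E[X He_k(xi)] / k!, computed by Gauss-Hermite quadrature.

Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))  # N(0,1) CDF

nodes, weights = hermegauss(60)       # quadrature for the weight e^{-x^2/2}
weights = weights / sqrt(2.0 * pi)    # renormalize to the N(0,1) measure

K = 8                                 # truncation order
coeffs = []
for k in range(K + 1):
    He_k = np.zeros(k + 1)
    He_k[k] = 1.0                     # coefficient vector selecting He_k
    c_k = np.sum(weights * Phi(nodes) * hermeval(nodes, He_k)) / factorial(k)
    coeffs.append(c_k)

# L^2(N(0,1)) error of the truncated expansion sum_k c_k He_k(xi):
resid = hermeval(nodes, coeffs) - Phi(nodes)
err = sqrt(np.sum(weights * resid ** 2))
print(coeffs[0])   # ~0.5, the mean of Uniform(0,1)
print(err)         # small: the degree-8 truncation is already close in L^2
```

Increasing `K` drives the $L^2$ error to zero, which is exactly the sense in which the chaos expansion approximates the target variable.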

So in essence, we can do such a thing. Note, however, that this requires finite variance: since your distribution is not even in $L^1$, as you observe, the technique does not apply to it directly.

I am on mobile at the moment so it is hard for me to pull references, but if you search my past answers you can get some very accessible papers.

Second answer

If I were aiming for a continuous approximation to what looks like a St Petersburg distribution, I might look at the cumulative distribution function of the discrete distribution: $P(X \le 2^n) = F_d(2^n) = 1-\frac{1}{2^{n+1}}$. This implies $F_d(k\,2^{n+1}) = 1-\frac{1}{2^{n+1}}$ for $\frac12 \lt k \lt 1$.

I would then suggest a cumulative distribution function for the continuous approximation of $F_c(x)=1-\dfrac{k}{x}$, which would give a density of $f_c(x)=\dfrac{k}{x^2}$ for $x \gt k$.

This will give the main features you are looking for: the probability of being between $2^n$ and $2^{n+1}$ is double the probability of being between $2^{n+1}$ and $2^{n+2}$, and so the expectation is infinite.

All that remains is to choose a value for $k$ for a reasonable approximation. Possibilities might include $\dfrac12$, $1$, $\sqrt{\dfrac12}$ and $\dfrac{1}{\log_e(4)}$, with the first two in a sense providing bounds on the cumulative distribution and the last two having some possibly convenient properties, but in the end the choice is arbitrary and may not be very important.
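A short numerical sketch of this comparison (the candidate values of $k$ are the ones listed above): it evaluates the gap between $F_d$ and $F_c$ at the atoms $x = 2^n$, showing that $k = \frac12$ matches $F_d$ exactly at the atoms while the other choices sit between the two bounding cases.

```python
import math

def F_d(x):
    """Discrete CDF: P(X <= x) for P(X = 2^n) = 1/2^(n+1), n >= 0."""
    total, n = 0.0, 0
    while 2 ** n <= x:
        total += 2.0 ** -(n + 1)
        n += 1
    return total

def F_c(x, k):
    """Continuous approximation F_c(x) = 1 - k/x for x > k."""
    return 1.0 - k / x if x > k else 0.0

candidates = {"1/2": 0.5, "1": 1.0,
              "sqrt(1/2)": math.sqrt(0.5), "1/ln 4": 1.0 / math.log(4.0)}
gaps = {}
for name, k in candidates.items():
    # largest discrepancy between the two CDFs at the atoms x = 2^n
    gaps[name] = max(abs(F_d(2 ** n) - F_c(2 ** n, k)) for n in range(20))
    print(f"k = {name}: max gap at the atoms = {gaps[name]:.4f}")
```

With $k=\frac12$ the gap at the atoms is exactly zero, since $F_c(2^n) = 1 - \frac{1/2}{2^n} = 1 - \frac{1}{2^{n+1}} = F_d(2^n)$; with $k=1$ the approximation instead matches the left-hand limits just below each atom.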