Are all probability density functions described by their mean and variance?


The question might be trivial, but I would like someone to correct or confirm my reasoning.

I know that the Normal (Gaussian) distribution is completely determined by its mean and variance, but does that hold for any other distribution? I assume the answer is no. The property does seem to hold for many distributions, but there are exceptions: for example, the mean and variance of the Cauchy distribution are undefined.

3 Answers


Fix $m \in \mathbb{R}$ and $\sigma^2>0$. For any $p \in (0,1]$, define a random variable $X$ by $$X = \left\{ \begin{array}{ll} m &\mbox{ with prob $1-p$} \\ m+ \frac{\sigma}{\sqrt{p}} & \mbox{ with prob $p/2$} \\ m-\frac{\sigma}{\sqrt{p}} & \mbox{ with prob $p/2$} \end{array} \right.$$ Then $X$ has mean $m$ and variance $\sigma^2$. The probability distribution of $X$ is different for each $p \in (0,1]$. Hence, infinitely many probability distributions give rise to mean $m$ and variance $\sigma^2$.
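As a quick sanity check, the construction above can be verified numerically (the helper name is mine, a sketch only):

```python
import math

def three_point_moments(m, sigma, p):
    """Mean and variance of X: m w.p. 1-p, m ± sigma/sqrt(p) each w.p. p/2."""
    d = sigma / math.sqrt(p)
    xs = [m, m + d, m - d]
    ps = [1 - p, p / 2, p / 2]
    mean = sum(q * x for q, x in zip(ps, xs))
    var = sum(q * (x - mean) ** 2 for q, x in zip(ps, xs))
    return mean, var

# Different p, same mean and variance every time:
for p in (0.1, 0.5, 1.0):
    print(p, three_point_moments(m=2.0, sigma=3.0, p=p))  # always (2.0, 9.0) up to rounding
```

The two symmetric atoms cancel in the mean, and each contributes $(p/2)\cdot\sigma^2/p$ to the variance, which is why every $p$ gives the same pair $(m, \sigma^2)$.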


Sure, it holds for other distributions. The usual examples beyond the Normal are the Exponential and the Poisson: each has a single parameter, which determines both the mean and the variance (the two are related in both cases). For the Chi-squared distribution, the single parameter, called "degrees of freedom", equals the mean, and the variance is twice that parameter.
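A quick Monte Carlo sanity check of these relations (a sketch; the Chi-squared with $k$ degrees of freedom is sampled here as a Gamma with shape $k/2$ and scale $2$, a standard identity):

```python
import random
import statistics

random.seed(0)
n = 200_000

# Exponential(rate λ): the single parameter fixes both moments,
# mean 1/λ and variance 1/λ².
lam = 2.0
xs = [random.expovariate(lam) for _ in range(n)]
print(statistics.fmean(xs), statistics.pvariance(xs))   # ≈ 0.5, ≈ 0.25

# Chi-squared(k) = Gamma(shape k/2, scale 2): mean k, variance 2k.
k = 4
ys = [random.gammavariate(k / 2, 2) for _ in range(n)]
print(statistics.fmean(ys), statistics.pvariance(ys))   # ≈ 4, ≈ 8
```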

But this property does not hold for all (or even most) distributions, as you correctly stated. Some distributions do not even have a finite expected value or variance. You can easily check, for instance, that the simple density $$f(x)=\frac{1}{x^2}\ \ \text{for}\ \ x\in[1,\infty), 0\ \ \text{otherwise},$$ has no finite expected value: the integral used to compute it, $\int_1^\infty x\,f(x)\,dx = \int_1^\infty \frac{dx}{x}$, diverges.
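The divergence is easy to see numerically: truncating the integral at an upper limit $b$ gives values that grow like $\ln b$ and never settle (a sketch using the midpoint rule; the helper name is mine):

```python
import math

def truncated_mean(b, n=200_000):
    """Approximate the truncated mean ∫_1^b x·f(x) dx with f(x) = 1/x²,
    i.e. ∫_1^b dx/x, by the midpoint rule. The exact value is ln(b)."""
    h = (b - 1) / n
    return sum(h / (1 + (i + 0.5) * h) for i in range(n))

for b in (10, 1_000, 100_000):
    print(b, truncated_mean(b))   # ≈ 2.30, ≈ 6.91, ≈ 11.5 — grows like ln(b)
```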

In some cases, expressing a distribution with the expected value and variance (or standard deviation) as its parameters requires some work. For instance, the Uniform distribution, more often specified by its limits $a$ and $b$, can be written as $$f(x)=\frac{1}{2\sigma \sqrt{3}}\ \text{for}\ \ x\in [\mu-\sigma\sqrt{3},\mu+\sigma\sqrt{3}]\ \text{and}\ 0\ \text{otherwise}.$$ An interesting case is this: if $X$ has a Log-Normal($\mu$,$\sigma$) distribution, the parameters $\mu$ and $\sigma$ that appear in the usual representation are not the expected value and standard deviation of $X$; they are the mean and standard deviation of $Y=\ln X$. In this case, however, you can express $\mu$ and $\sigma$ in terms of $E(X)$ and $V(X)$.
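Both re-parametrizations can be checked with a short script (helper names are mine; the Log-Normal conversion uses the standard identities $E(X)=e^{\mu+\sigma^2/2}$ and $V(X)=(e^{\sigma^2}-1)E(X)^2$):

```python
import math

def uniform_limits(mu, sigma):
    """Recover the usual Uniform(a, b) limits from the (mean, std-dev) parametrization."""
    return mu - sigma * math.sqrt(3), mu + sigma * math.sqrt(3)

def lognormal_params(ex, vx):
    """Recover (mu, sigma) of ln X from E(X) and V(X) for a Log-Normal X."""
    sigma2 = math.log(1 + vx / ex ** 2)
    mu = math.log(ex) - sigma2 / 2
    return mu, math.sqrt(sigma2)

a, b = uniform_limits(5.0, 2.0)
print((a + b) / 2, (b - a) ** 2 / 12)   # standard Uniform formulas give back 5.0 and 4.0

mu, sigma = lognormal_params(ex=2.0, vx=1.0)
print(math.exp(mu + sigma ** 2 / 2))    # back to E(X) = 2.0
```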


Check out any family of distributions that is parametrized by at least three parameters, e.g.

$$f_{a,b,c}(x)=N(a,b,c)\cdot (ax^2+bx+c)$$

for $x\in[0,1]$, where $N(a,b,c)$ is a normalization factor. There are several combinations of the parameters $a$, $b$ and $c$ that generate the same mean and variance. Higher moments, however, can be used to fix more than two parameters.
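A minimal numeric check, using the closed-form raw moments $\int_0^1 x^k(ax^2+bx+c)\,dx = \frac{a}{k+3}+\frac{b}{k+2}+\frac{c}{k+1}$ (the function name is mine). The simplest such coincidence: since $N(a,b,c)$ absorbs the overall scale, the distinct parameter triples $(1,0,1)$ and $(2,0,2)$ yield identical mean and variance:

```python
def moments_quadratic(a, b, c):
    """Mean and variance of f(x) = N·(a x² + b x + c) on [0, 1], computed from
    the raw moments ∫_0^1 x^k (a x² + b x + c) dx = a/(k+3) + b/(k+2) + c/(k+1)."""
    P = [a / (k + 3) + b / (k + 2) + c / (k + 1) for k in range(3)]
    mean = P[1] / P[0]               # E[X] = P1 / P0 after normalization
    var = P[2] / P[0] - mean ** 2    # V[X] = E[X²] - E[X]²
    return mean, var

print(moments_quadratic(1, 0, 1))   # same pair...
print(moments_quadratic(2, 0, 2))   # ...as this one
```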