When the mean of a non-negative, integer-valued random variable goes to zero, does this imply anything about the other raw moments?


When the mean of a non-negative, integer-valued random variable $X_n$, for example counting paths in a random graph on $n$ vertices between two fixed vertices, goes to zero,

$$\lim_{n \to \infty}\mathbb{E}[X_n] =0$$

can we have $\lim_{n \to \infty}\mathbb{E}[X_n^3] = \infty$, or under what conditions does this occur? This seems plausible, since we only need two different sequences to have different limits, but it is a bit counter-intuitive: we are counting objects, and the $k$-th raw moment gives the typical number of ordered $k$-tuples of them.

3 Answers

BEST ANSWER

Sure. Start with an example that has a finite first moment and an infinite third moment, such as $X$ with probability function proportional to $\frac{1}{x^3}$ for $x \geq 1$. Then, modify it just a bit: let $X_n$ have probability function proportional to $\frac{1}{x^3 \log(x)^n}$ (let's say, restrict to $x \geq 3$ so we can guarantee $\log(x) > 1$).

The first moment will tend to zero, as $\sum_{x=3}^{\infty} \frac{x}{x^3 \log(x)^n} < \frac{1}{\log(3)^n} \frac{\pi^2}{6} \to 0$, but the third moment will never exist, as $\sum_{x=3}^{\infty} \frac{x^3}{x^3\log(x)^n}$ diverges.

You can also adjust the moment at which this occurs by changing the power on $x$ in the denominator; changing it from $3$ to $5$ will guarantee existence of the third moment but not the fourth, for instance.
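As a numeric sketch of that last claim (my own check, not part of the original answer): with unnormalized weights $1/x^5$, the partial sums behind the third moment settle down to a finite value, while those behind the fourth moment keep growing like $\log N$.

```python
# Partial sums of sum_x x^moment / x^power, with weights proportional to 1/x^5.
# Third moment: sum 1/x^2 converges (to pi^2/6); fourth moment: sum 1/x diverges.
def partial(moment, N, power=5):
    return sum(x**moment / x**power for x in range(1, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, partial(3, N), partial(4, N))
# the third-moment column stabilizes near pi^2/6 ~ 1.645;
# the fourth-moment column keeps growing without bound
```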


EDIT: I'd like to thank A rural reader for pointing out a flaw in my argument: by sweeping the normalization constant under the rug, I think my sequence doesn't actually do what it's supposed to do, because that normalization constant grows with $n$, and I believe it may do so fast enough to prevent $\mathbb E[X_n] \to 0$. Instead, let me choose a much simpler example that's more in the spirit of William M's comment:

Let $X_n$ have the following distribution: $$X_n = \begin{cases} n^2, & \text{w/ prob. } \frac{1}{n^3} \\ 0, & \text{w/ prob. } 1 - \frac{1}{n^3} \end{cases}.$$ Then $\mathbb E[X_n] = \frac{n^2}{n^3} = \frac{1}{n} \to 0$, while $\mathbb E[X_n^3] = \frac{(n^2)^3}{n^3} = n^3 \to \infty$.
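A quick exact computation (a sketch I added to check the two-point example; the helper name `moments` is my own) confirms how the mean and third moment scale:

```python
from fractions import Fraction

def moments(n):
    # X_n = n^2 with probability 1/n^3, and 0 otherwise
    p = Fraction(1, n**3)
    mean = p * n**2           # = 1/n
    third = p * (n**2)**3     # = n^3
    return mean, third

for n in (10, 100, 1000):
    m1, m3 = moments(n)
    print(n, m1, m3)
# the mean shrinks like 1/n while the third moment grows like n^3
```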

ANSWER

If $\mathbb{E}(X_n)\to 0$, it is entirely possible that $\mathbb{E}\left(X_n^3\right)\to\infty$ as $n\to\infty$. For example, each $X_n$ may have a finite mean but an infinite third moment; in that case the limit of the third moments is trivially infinite.

Even when the $X_n$ are supported on finite sets, $\mathbb{E}(X_n)\to 0$ does not imply that any higher moment converges to zero.

ANSWER

Here is a concrete example about counting things in random graphs.

Let $H$ be the $5$-vertex graph obtained by adding an edge to $K_4$:

(Figure: $H$, a $K_4$ with one pendant edge attached.)

Suppose that $X_n$ counts the number of copies of $H$ in $G_{n,p}$, for some carefully chosen $p$. Then $\mathbb E[X_n] = \Theta(n^5 p^7)$. So if we choose $p = o(n^{-5/7})$, then $\mathbb E[X_n] \to 0$ as $n \to \infty$.

However, for such values of $p$, we have $\mathbb E[X_n^2] = \Theta(n^6 p^8)$: $X_n^2$ counts ordered pairs of copies of $H$, and the $\Theta(n^6 p^8)$ term comes from two copies that share $K_4$ but have different edges coming out of the $K_4$. Whenever $p$ is (asymptotically) between $n^{-3/4}$ and $n^{-5/7}$, such as $p = n^{-0.74}$, we have $\mathbb E[X_n] \to 0$ but $\mathbb E[X_n^2] \to \infty$ as $n \to \infty$.

Similarly, there is a $\Theta(n^7 p^9)$ contribution to $\mathbb E[X_n^3]$ that comes from three copies of $H$ that all share the same $K_4$. So if we take a value of $p$ such as $n^{-0.77}$, then we get $\mathbb E[X_n] \to 0$, $\mathbb E[X_n^2] \to 0$, but $\mathbb E[X_n^3] \to \infty$ as $n \to \infty$.
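The exponent bookkeeping above can be checked numerically. This is a sketch under the stated assumption $p = n^{-0.77}$, with all multiplicative constants suppressed, so only the direction of growth is meaningful:

```python
# Leading terms for copies of H in G(n,p): n^5 p^7 (mean),
# n^6 p^8 (pairs of copies sharing a K_4), n^7 p^9 (triples sharing a K_4).
def leading_terms(n, exponent=-0.77):
    p = n ** exponent
    return n**5 * p**7, n**6 * p**8, n**7 * p**9

for n in (10**3, 10**6, 10**9):
    print(n, leading_terms(n))
# net exponents are 5 - 5.39 = -0.39, 6 - 6.16 = -0.16, and 7 - 6.93 = +0.07,
# so the first two terms shrink while the third grows
```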

Essentially, when you see a situation where the mean goes to $0$ but higher moments go to infinity, this tells you that $X_n$ counts objects that rarely appear, but appear in bunches when they do. That's what we see here: it is rare to find a copy of $H$ in $G_{n,p}$ for this range of $p$, but if you find one copy, you find many copies.