Calculating factorial moment for distributions


I don't really understand how to calculate factorial moments of distributions beyond just looking at the formulas given in my textbook. So say I want to calculate the factorial moment $E[X(X-1)]$ for the Poisson and binomial distributions:

For the binomial distribution, we can see that $E[X(X-1)]=E[(X)_{2}]=E[X!/(X-2)!]=\sum_x p(x)\, x!/(x-2)!$. Okay, what should I do next?

I'm not entirely sure whether anything else changes for the Poisson distribution... but seeing that the end result involves $\lambda$ makes me wonder what happens in between...


It is not clear what you wanted to express by

$$E[(X)_{2}]=E[x!/(x-2)!].$$

By definition, the $r$th factorial moment of a random variable $X$ is

$$E[X(X-1)(X-2)\cdots (X-r+1)].$$

We assume that the expectation in question exists.

In the case of the binomial distribution with parameters $n,p$, the second factorial moment is, by definition,

$$E[(X)_2]=E[X(X-1)].$$

To compute this quantity we do an ordinary expectation calculation, that is

$$E[X(X-1)]=\color{red}{\sum_{k=0}^n k(k-1){n\choose k}p^k(1-p)^{n-k}}=\frac{n!}{(n-2)!}p^2$$ (for $n\ge 2$). Expanding the ratio of factorials,

$$\frac{n!}{(n-2)!}p^2=\frac{1\cdot2\cdots(n-2)(n-1)n}{1\cdot2\cdots(n-2)}p^2=(n-1)np^2=n^2p^2-np^2.$$ The red part is a well-defined calculation; I looked up the result in the wiki.
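If it helps, the red sum can be verified numerically; here is a minimal Python sketch, with $n$ and $p$ as example values of my own choosing:

```python
from math import comb

# Numerical check of the red sum: E[X(X-1)] for X ~ Binomial(n, p).
# n and p are example values, not anything from the answer itself.
n, p = 10, 0.3

direct = sum(k * (k - 1) * comb(n, k) * p**k * (1 - p)**(n - k)
             for k in range(n + 1))
closed_form = n * (n - 1) * p**2  # n!/(n-2)! = (n-1)n

print(direct, closed_form)  # both are approximately 8.1
```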

But we could do the calculation on our own if we notice that

$$E[X(X-1)]=E[X^2]-E[X],$$

that is the second factorial moment equals the second moment minus the expectation. In the binomial case the expectation is $np$ and the variance is $\sigma^2=np(1-p)$. We know that $\sigma^2=E[X^2]-E^2[X]$. Hence
$$E[X^2]=np(1-p)+n^2p^2.$$

So

$$E[(X)_2]=E[X(X-1)]=np(1-p)+n^2p^2-np=n^2p^2-np^2.$$
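This route can be checked numerically too; another small Python sketch with example values:

```python
from math import comb

# Check the variance route: E[X(X-1)] = E[X^2] - E[X], with
# E[X] = np and E[X^2] = np(1-p) + (np)^2 for X ~ Binomial(n, p).
# n and p are example values.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

EX = sum(k * w for k, w in enumerate(pmf))
EX2 = sum(k * k * w for k, w in enumerate(pmf))

print(EX2 - EX)                # E[X(X-1)]
print(n**2 * p**2 - n * p**2)  # closed form from the derivation above
```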

The result for the Poisson distribution can be found in wiki too. In that case the calculation is

$$\sum_{k=0}^{\infty}k(k-1)\frac{\lambda^ke^{-\lambda}}{k!}=\lambda^2.$$

For the second factorial moment you can do the direct calculation based on the facts that the expectation is $\lambda$ and the variance is also $\lambda$, so the second moment is $E[X^2]=\lambda+\lambda^2$. Thus

$$E[(X)_2]=E[X^2]-E[X]=\lambda^2.$$
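A quick numerical check of the Poisson case, truncating the infinite sum (the rate $\lambda$ below is just an example value):

```python
from math import exp, factorial

# Truncated-sum check that E[X(X-1)] = lambda^2 for X ~ Poisson(lam).
# lam is an example rate; 100 terms is far more than needed to converge.
lam = 2.5
second_factorial_moment = sum(
    k * (k - 1) * lam**k * exp(-lam) / factorial(k) for k in range(100)
)
print(second_factorial_moment, lam**2)  # both are approximately 6.25
```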

I hope that you can look at the $r^{th}$ factorial moment as if it was simply a calculation of the expected value of a function of a random variable.


The posted answer states that you can directly calculate the $r$th factorial moment as a polynomial in the individual moments $E[X^k]$. While the suggested approach gives correct results when the calculation is feasible, it makes it seem as though factorial moments exist as an exercise in tedious calculation and not much more.

Certainly this is not the case, and the use of factorial moments simplifies calculations. However, as with any tool, you need to know how to use it to get the benefit, and for factorial moments there are several ways to do so. The binomial distribution illustrates some of the use of the factorial moment as a tool for simplifying calculations.

The two things to recognize about the factorial moment here are:

(i) $(x)_k\,(x-k)! = x!$

and

(ii) $\sum_{x\geq 0} (x)_k \Pr[X=x] = \sum_{x\geq k} (x)_k \Pr[X=x]$, since $(x)_k = 0$ whenever $0\leq x<k$

Let's call these the peeling one and peeling two properties. They each 'peel' part of a calculation off from the full calculation: in the first case the peeling is of a factorial, and in the second it is of a (potentially infinite) sum.
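Both properties are easy to sanity-check numerically; a small Python sketch (assuming Python 3.8+, where `math.perm` computes the falling factorial):

```python
from math import factorial, perm

# Sanity-check both peeling properties. math.perm(x, k) computes exactly
# the falling factorial (x)_k = x!/(x-k)!, and returns 0 when k > x.
for x in range(12):
    for k in range(x + 1):
        # peeling one: (x)_k * (x-k)! == x!
        assert perm(x, k) * factorial(x - k) == factorial(x)

# peeling two rests on (x)_k = 0 for 0 <= x < k, so those terms drop out
assert all(perm(x, k) == 0 for k in range(1, 6) for x in range(k))
print("both peeling properties hold on the tested range")
```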

Now consider the expectation:

$$\mathbb{E}\left[(X)_k\right] = \sum_{x\geq 0} {n \choose{x}}(x)_k p^xq^{n-x},$$

where we've written $q=1-p$ for brevity.

First write out the combination in terms of factorials so we can use peeling one, and lop off the first $k$ terms via peeling two. This gives us,

$$\sum_{x\geq 0} {n \choose{x}}(x)_k p^xq^{n-x}=\sum_{x\geq k}\frac{n!}{(n-x)!x!}(x)_k p^xq^{n-x}.$$

Now we apply peeling one and also do a plus/minus $k$ in the exponent of $p$, to yield,

$$\sum_{x\geq k}\frac{n!}{(n-x)!x!}(x)_k p^xq^{n-x}=\sum_{x\geq k}\frac{n!}{(n-x)!(x-k)!}p^{x-k+k}q^{n-x}.$$

Next use peeling one on $n!$ to peel it down by $k$ factors, writing $n!=(n)_k\,(n-k)!$, and factor the terms $(n)_k$ and $p^k$ out of the sum. We can move them outside because we are summing over values of $x\geq k$, so $k$ is really a fixed value here. This gives us

$$\sum_{x\geq k}\frac{n!}{(n-x)!(x-k)!}p^{x-k+k}q^{n-x}=p^k(n)_k\sum_{x\geq k}\frac{(n-k)!}{(n-x)!(x-k)!}p^{x-k}q^{n-x}.$$

Now re-index the sum from $x\geq k$ to $z=x-k \geq 0$. The previous sum over $x$ had nonzero terms up to $n$, so this re-indexing shifts the range to start at $0$ and run to $n-k$. Doing so and staring at the formula inside the sum, you'll notice it is a new binomial sum, from $0$ to $n-k$, and it therefore sums to $1$ for the same reason that a binomial distribution sums to $1$ when summed from $0$ to $n$.

$$p^k(n)_k\sum_{x\geq k}\frac{(n-k)!}{(n-x)!(x-k)!}p^{x-k}q^{n-x} =p^k(n)_k\underbrace{\sum_{z=0}^{n-k}\frac{(n-k)!}{(n-k-z)!z!}p^{z}q^{n-k-z}}_{=1}.$$

So we see that the formula for a binomial factorial moment is $\mathbb{E}[(X)_k] = (n)_kp^k$.
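This formula can be spot-checked numerically; a short Python sketch with example values for $n$ and $p$:

```python
from math import comb, perm

# Check E[(X)_k] = (n)_k p^k for X ~ Binomial(n, p), where math.perm(n, k)
# is the falling factorial (n)_k. n and p are example values.
n, p, q = 12, 0.4, 0.6
for k in range(5):
    moment = sum(perm(x, k) * comb(n, x) * p**x * q**(n - x)
                 for x in range(n + 1))
    print(k, moment, perm(n, k) * p**k)
```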

Also, you can show by similar arguments that the Poisson distribution has $\mathbb{E}[(X)_k] = \lambda^k$. The re-indexing of the sum is somewhat simpler in the Poisson case because the sum is infinite, so you don't need to be as careful with the upper bound.
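A similar truncated-sum check works for the Poisson case (the rate below is an example value):

```python
from math import exp, factorial, perm

# Truncated-sum check that E[(X)_k] = lambda^k for X ~ Poisson(lam).
# lam is an example rate; 100 terms is plenty for convergence here.
lam = 1.7
for k in range(5):
    moment = sum(perm(x, k) * lam**x * exp(-lam) / factorial(x)
                 for x in range(100))
    print(k, moment, lam**k)
```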