Strange statistic


Let $N$ be a random integer and let $p_N(n)$ be its discrete density. Does the statistic \begin{equation}S(N)\triangleq\sum_{n=2}^\infty p_N(n)\,n\,(n-1)\end{equation} have a specific meaning? It is clearly not, in general, the variance of $N$; rather, \begin{equation}S(N)=\mathbb{E}[N^2]-\mathbb{E}[N],\end{equation} where $\mathbb{E}[N]$ denotes the expected value of $N$. Indeed, \begin{equation}\begin{aligned}S(N)&=\sum_{n=2}^\infty p_N(n)\,n^2 -\sum_{n=2}^\infty p_N(n)\,n\\ &=\left(\sum_{n=0}^\infty p_N(n)\,n^2-p_N(1)\right)-\left(\sum_{n=0}^\infty p_N(n)\,n-p_N(1)\right)\\ &=\sum_{n=0}^\infty p_N(n)\,n^2-\sum_{n=0}^\infty p_N(n)\,n\\ &=\mathbb{E}[N^2]-\mathbb{E}[N].\end{aligned}\end{equation} The statistic $S(N)$ equals the variance $\mathbb{E}[N^2]-\mathbb{E}[N]^2$ only when $\mathbb{E}[N]=\mathbb{E}[N]^2$, that is, only if $\mathbb{E}[N]$ is $0$ or $1$.
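The identity is easy to check numerically for a concrete distribution. The sketch below (function names are my own) truncates the defining series for $N\sim\operatorname{Poisson}(\lambda)$, for which $\mathbb{E}[N^2]-\mathbb{E}[N]=(\lambda+\lambda^2)-\lambda=\lambda^2$:

```python
import math

def truncated_S(lam, nmax=100):
    """S(N) = sum_{n>=2} p_N(n) n (n-1) for N ~ Poisson(lam), tail truncated at nmax."""
    p = math.exp(-lam)  # p_N(0)
    S = 0.0
    for n in range(1, nmax + 1):
        p *= lam / n          # Poisson recurrence: p_N(n) = p_N(n-1) * lam / n
        if n >= 2:
            S += p * n * (n - 1)
    return S

print(truncated_S(3.0))   # ≈ 3.0**2 = 9.0
```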


\begin{align} & \text{The $k$th} \textbf{ factorial moment } \text{of the} \\ & \text{distribution of the random variable $X$} \\[10pt] = {} & \operatorname E\big( \,\underbrace{X(X-1)(X-2)\cdots(X-k+1)}_\text{$k$ factors} \, \big) \end{align}

The only reason I know of (though I am ignorant of many things) why this is of interest is that it can be computed neatly when $X$ has a Poisson distribution; the moments of the Poisson distribution can then be obtained by some algebra afterward.

\begin{align} & \text{Suppose } X \sim\operatorname{Poisson}(\lambda). \text{ Then} \\[10pt] & \operatorname E\big( X(X-1)(X-2)\cdots(X-k+1) \big) \\[8pt] = {} & \sum_{x\,=\,0}^\infty x(x-1)(x-2)\cdots(x-k+1) \Pr(X=x) \tag 1 \\[8pt] = {} & \sum_{x\,=\,k}^\infty x(x-1)(x-2)\cdots(x-k+1) \Pr(X=x) \\ & \text{because the sum of the first $k$ terms on line $(1)$ above is 0} \\[8pt] = {} & \sum_{x\,=\,k}^\infty x(x-1)(x-2)\cdots(x-k+1)\cdot\frac{\lambda^x e^{-\lambda}}{x!} \\[8pt] = {} & \sum_{x\,=\,k}^\infty \frac{\lambda^x e^{-\lambda}}{(x-k)!} = \sum_{y\,=\,0}^\infty \frac{\lambda^{y+k} e^{-\lambda}}{y!} \quad \text{where } y = x-k \\[8pt] = {} & \lambda^k \sum_{y\,=\,0}^\infty \frac{\lambda^y e^{-\lambda}}{y!} \qquad = \lambda^k. \end{align}
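The closed form $\lambda^k$ above is simple to verify numerically. The sketch below (names are illustrative) truncates the series defining the $k$th factorial moment; note that the falling factorial $x(x-1)\cdots(x-k+1)$ vanishes automatically for $x<k$:

```python
import math

def poisson_factorial_moment(lam, k, nmax=120):
    """Truncated sum for E[X(X-1)...(X-k+1)], X ~ Poisson(lam)."""
    p = math.exp(-lam)          # Pr(X = 0)
    total = 0.0
    for x in range(0, nmax + 1):
        falling = math.prod(x - j for j in range(k))  # x(x-1)...(x-k+1), zero for x < k
        total += falling * p
        p *= lam / (x + 1)      # advance to Pr(X = x+1)
    return total

print(poisson_factorial_moment(2.0, 3))   # ≈ 2.0**3 = 8.0
```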

Proposition: The first $n$ moments of the distribution of the number of fixed points of a uniformly distributed random permutation of the set $\{1,\ldots,n\}$ are the same as the first $n$ moments of the Poisson distribution with expected value $1.$

Proof: For $k\le n,$ the $k$th factorial moment of the number of fixed points is the expected number of ordered $k$-tuples of distinct fixed points, which by linearity of expectation equals $\frac{n!}{(n-k)!}\cdot\frac{(n-k)!}{n!}=1;$ this matches the $k$th factorial moment $\lambda^k=1$ of the Poisson distribution with $\lambda=1.$ Since the $k$th moment is a linear combination of the first $k$ factorial moments (with Stirling-number coefficients), the first $n$ factorial moments determine the first $n$ moments; therefore the first $n$ moments are the same. $\qquad\blacksquare$
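As a quick sanity check (a sketch, not part of the proof; names are my own), one can enumerate all permutations of a small set, compute the first $n$ moments of the fixed-point count exactly, and compare with the moments of the Poisson distribution with expected value $1,$ which by Dobinski's formula are the Bell numbers $1, 2, 5, 15, \ldots$:

```python
import itertools
import math
from fractions import Fraction

def fixed_point_moments(n):
    """Exact first n moments of the fixed-point count of a uniform random
    permutation of {0, ..., n-1}, computed by full enumeration."""
    perms = list(itertools.permutations(range(n)))
    moments = []
    for m in range(1, n + 1):
        total = sum(sum(1 for i, v in enumerate(p) if i == v) ** m for p in perms)
        moments.append(Fraction(total, len(perms)))
    return moments

# Should match the first four moments of Poisson(1): the Bell numbers.
print([int(m) for m in fixed_point_moments(4)])   # [1, 2, 5, 15]
```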