Necessary conditions for Wick-Isserlis theorem


I'm looking at Isserlis' theorem for fourth-order moments:

$$E[X_1 X_2 X_3 X_4] = E[X_1 X_2] E[X_3 X_4] + E[X_1 X_3]E[X_2 X_4]+E[X_1 X_4]E[X_2 X_3]$$

The theorem is stated for the normal distribution. Is there a less restrictive condition under which the equality holds?

Specifically, I find that this equality gives a good fit when applied to empirical moments estimated from data whose distribution is unknown. What can I say about the underlying distribution?


Best answer:

There's a much more general formula which says that for any random variables $X_1, \dots, X_n$ such that the relevant joint moments exist, we have

$$\mathbb{E}(X_1 \dots X_n) = \sum_{\pi} \prod_{B \in \pi} \kappa(X_i : i \in B)$$

where $\kappa$ are the joint cumulants, the sum runs over all set partitions $\pi$ of $\{ 1, 2, \dots n \}$, and $B \in \pi$ means $B$ is one of the subsets in the partition. The first cumulant $\kappa(X_i)$ is the expectation, the second joint cumulant $\kappa(X_i, X_j)$ is the covariance, and the higher joint cumulants are "generalized covariances" (e.g. they vanish if their components are independent).

For example, for $n = 4$, and assuming the $X_i$ have mean zero to simplify (so every partition containing a singleton block drops out, since the first cumulant $\kappa(X_i) = \mathbb{E}(X_i)$ vanishes), we have

$$\mathbb{E}(X_1 X_2 X_3 X_4) = \mathbb{E}(X_1 X_2) \mathbb{E}(X_3 X_4) + \mathbb{E}(X_1 X_3) \mathbb{E}(X_2 X_4) + \mathbb{E}(X_1 X_4) \mathbb{E}(X_2 X_3) + \kappa(X_1, X_2, X_3, X_4)$$

although $\kappa$ is more or less defined so that this identity holds, so the formula doesn't have much content until you bring in further facts about cumulants.
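Since the fourth joint cumulant is exactly the residual of the pairing formula for zero-mean variables, it's easy to estimate from samples. A minimal numpy sketch (the function name `fourth_cumulant` and the specific covariance matrix are my choices, not from the question):

```python
import numpy as np

def fourth_cumulant(x1, x2, x3, x4):
    """Sample estimate of kappa(X1, X2, X3, X4) for zero-mean variables:
    E[X1 X2 X3 X4] minus the three Isserlis pairing terms."""
    m = np.mean
    return (m(x1 * x2 * x3 * x4)
            - m(x1 * x2) * m(x3 * x4)
            - m(x1 * x3) * m(x2 * x4)
            - m(x1 * x4) * m(x2 * x3))

# Correlated jointly Gaussian samples: the estimate should be near zero.
rng = np.random.default_rng(0)
cov = np.eye(4) + 0.5            # 1.5 on the diagonal, 0.5 off-diagonal
z = rng.multivariate_normal(np.zeros(4), cov, size=200_000)
print(fourth_cumulant(*z.T))     # close to 0
```

Note this plug-in estimator is slightly biased at finite sample size; unbiased cumulant estimators (k-statistics) exist but are more involved.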

From this point of view, Isserlis' theorem is equivalent to the statement that if the $X_i$ are jointly Gaussian then all joint cumulants past the second vanish; by Marcinkiewicz's theorem, this property in fact characterizes Gaussians.
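For contrast, here is a hypothetical non-Gaussian example (my construction, not from the answer): take $X_1 = X_2 = X_3 = X_4 = Y$ with $Y$ standard Laplace, which has mean zero, variance $2$, and $\mathbb{E}(Y^4) = 24$. The residual $\mathbb{E}(Y^4) - 3\,\mathbb{E}(Y^2)^2 = 24 - 12 = 12$ is the fourth cumulant, visibly far from zero:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.laplace(0.0, 1.0, size=500_000)  # mean 0, variance 2, non-Gaussian

# Residual of the Isserlis formula with X1 = X2 = X3 = X4 = Y:
# kappa(Y, Y, Y, Y) = E[Y^4] - 3 E[Y^2]^2 = 24 - 3 * 4 = 12 for Laplace(0, 1).
residual = np.mean(y**4) - 3 * np.mean(y**2)**2
print(residual)  # close to 12, not 0
```

So a good empirical fit of the pairing identity is evidence that the fourth joint cumulants of your data are small, which is consistent with (but does not prove) joint Gaussianity.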