Simple formula for $X_n^{(k)} = \sum_{1 \le i_1 < ... < i_k \le n} Y_{i_1} \cdot \dots \cdot Y_{i_k}$ (to show $X_n^{(k)}$ is a martingale)


Let $$X_n^{(k)} = \sum_{1 \le i_1 < ... < i_k \le n} Y_{i_1} \cdot \dots \cdot Y_{i_k}$$

If I take $k=2$ and set $S_n = Y_1 + \dots + Y_n$, then of course:

$$X_n^{(2)} = \frac{1}{2} (S_n^2 - \sum_{i=1}^n Y_i^2)$$

My question is: is it possible to find a simple formula for general $k$?

PS

This question is only part of a larger exercise. The full exercise is to show that $X_n^{(k)}$ is a martingale when $(Y_i)_{i \ge 1}$ are i.i.d. and $EY_i = 0$. For $k=2$ this is easy, because I have the formula for $X_n^{(2)}$ above.
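For concreteness, here is a minimal numerical sketch (my own, not part of the exercise) that computes $X_n^{(k)}$ by brute force over index combinations and checks the $k = 2$ identity above with exact rationals:

```python
from itertools import combinations
from fractions import Fraction
from math import prod

def X(ys, k):
    # X_n^(k): sum of Y_{i_1} * ... * Y_{i_k} over 1 <= i_1 < ... < i_k <= n
    return sum(prod(c) for c in combinations(ys, k))

ys = [Fraction(1), Fraction(-2), Fraction(3, 2), Fraction(-1, 3)]
S = sum(ys)
# check X_n^(2) = (S_n^2 - sum_i Y_i^2) / 2
assert X(ys, 2) == (S**2 - sum(y**2 for y in ys)) / 2
print(X(ys, 2))  # -> -11/3
```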


Answer 1

Hint: It follows from the independence of the random variables $(Y_i)_i$ (with $\mathcal{F}_n = \sigma(Y_1, \dots, Y_n)$) that

$$\mathbb{E}(Y_{i_1} \cdots Y_{i_k} \mid \mathcal{F}_{n-1}) = Y_{i_1} \cdots Y_{i_k}$$

if $i_j \leq n-1$ for all $j=1,\ldots,k$ and

$$\mathbb{E}(Y_{i_1} \cdots Y_{i_k} \mid \mathcal{F}_{n-1}) = 0$$

whenever there exists $j_0 \in \{1,\ldots,k\}$ such that $i_{j_0} = n$ and $i_j \leq n-1$ for all $j \neq j_0$.
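Summing the hint's two cases over all index tuples amounts to the deterministic recursion $X_n^{(k)} = X_{n-1}^{(k)} + Y_n X_{n-1}^{(k-1)}$ (with the convention $X_m^{(0)} = 1$), which then gives $\mathbb{E}(X_n^{(k)} \mid \mathcal{F}_{n-1}) = X_{n-1}^{(k)} + \mathbb{E}[Y_n]\, X_{n-1}^{(k-1)} = X_{n-1}^{(k)}$. A quick check of that algebraic identity (my own sketch, not part of the answer):

```python
from itertools import combinations
from math import prod

def X(ys, k):
    # X_n^(k), with the convention X_n^(0) = 1 (empty product)
    return sum(prod(c) for c in combinations(ys, k)) if k > 0 else 1

ys = [2, -1, 3, 5, -4]
for n in range(1, len(ys) + 1):
    for k in range(1, n + 1):
        # split the sum by whether the largest index equals n
        assert X(ys[:n], k) == X(ys[:n - 1], k) + ys[n - 1] * X(ys[:n - 1], k - 1)
print("recursion verified")
```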

Answer 2

This appears to be a rather nice application of the moments-of-moments literature. In particular, let $(Y_1, \dots, Y_n)$ denote a random sample of size $n$ drawn from a random variable $Y$ with finite mean $E[Y]$. Then your sum is:

$$X_n^{(k)} = \sum_{1 \le i_1 < ... < i_k \le n} Y_{i_1} \cdot \dots \cdot Y_{i_k} \quad = \quad \frac{1}{k!}A_{[1^k]}$$

where $A_{[1^k]}$ denotes an elementary example of an augmented symmetric function.

OP asks:

$$X_n^{(2)} = \frac{1}{2} (S_n^2 - \sum_{i=1}^n Y_i^2)$$

My question is, is it possible to find simple formula for any $k$ ?

The quantities you are interested in are known as the elementary symmetric polynomials [ see, for instance, Newton's Identities ], and the conversion you desire expresses them in terms of the power sums $s_r=\sum _{i=1}^n Y_i^r$. Here, for example, are the first 5 such conversions:

$$X_n^{(1)} = s_1$$
$$X_n^{(2)} = \tfrac{1}{2}\left(s_1^2 - s_2\right)$$
$$X_n^{(3)} = \tfrac{1}{6}\left(s_1^3 - 3 s_1 s_2 + 2 s_3\right)$$
$$X_n^{(4)} = \tfrac{1}{24}\left(s_1^4 - 6 s_1^2 s_2 + 3 s_2^2 + 8 s_1 s_3 - 6 s_4\right)$$
$$X_n^{(5)} = \tfrac{1}{120}\left(s_1^5 - 10 s_1^3 s_2 + 15 s_1 s_2^2 + 20 s_1^2 s_3 - 20 s_2 s_3 - 30 s_1 s_4 + 24 s_5\right)$$

where I am using the MonomialToPowerSum function from the mathStatica package for Mathematica.
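These conversions can be sanity-checked without mathStatica; the following is a sketch of my own in plain Python, evaluating both sides at one rational point with $n = 6$ (a smoke test, not a proof):

```python
from itertools import combinations
from fractions import Fraction
from math import prod

def e(ys, k):  # elementary symmetric polynomial = X_n^(k)
    return sum(prod(c) for c in combinations(ys, k))

def s(ys, r):  # power sum s_r = sum_i Y_i^r
    return sum(y**r for y in ys)

ys = [Fraction(v) for v in (2, -3, 5, 7, -1, 4)]
s1, s2, s3, s4, s5 = (s(ys, r) for r in range(1, 6))
assert e(ys, 2) == (s1**2 - s2) / 2
assert e(ys, 3) == (s1**3 - 3*s1*s2 + 2*s3) / 6
assert e(ys, 4) == (s1**4 - 6*s1**2*s2 + 3*s2**2 + 8*s1*s3 - 6*s4) / 24
assert e(ys, 5) == (s1**5 - 10*s1**3*s2 + 15*s1*s2**2 + 20*s1**2*s3
                    - 20*s2*s3 - 30*s1*s4 + 24*s5) / 120
print("all conversions verified")
```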

But, importantly, there is no real reason to convert to power sums here, and you may be better off representing your problem in terms of augmented symmetrics. In particular, by the so-called fundamental expectation result, augmented symmetric polynomials have the property that:

$$E[X_n^{(k)}] \quad = \quad \frac{1}{k!} E\big[A_{[1^k]}\big] \quad = \quad \binom{n}{k}\big(E[Y]\big)^k$$

So, for example,

  • for $k = 2$: $E[X_n^{(2)}] = \frac{1}{2} n (n-1) \big(E[Y]\big)^2$
  • for $k = 3$: $E[X_n^{(3)}] = \frac{1}{6} n (n-1) (n-2) \big(E[Y]\big)^3$
  • for $k = 4$: $E[X_n^{(4)}] = \frac{1}{24} n (n-1) (n-2)(n-3) \big(E[Y]\big)^4$, and so on.

One can then impose your special case, $E[Y]=0$, as desired.
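The fundamental expectation result can be verified exhaustively for a toy discrete distribution (my own sketch; the two-point law below is an assumption for illustration, not from the answer):

```python
from itertools import combinations, product
from fractions import Fraction
from math import prod, comb

# Hypothetical two-point distribution: Y = -1 w.p. 1/3, Y = 2 w.p. 2/3, so E[Y] = 1.
vals = [(Fraction(-1), Fraction(1, 3)), (Fraction(2), Fraction(2, 3))]
mu = sum(v * p for v, p in vals)  # E[Y]

n, k = 4, 2
EX = Fraction(0)
for outcome in product(vals, repeat=n):  # enumerate all 2^n sample points exactly
    ys = [v for v, _ in outcome]
    pr = prod(p for _, p in outcome)     # probability of this outcome (independence)
    EX += pr * sum(prod(c) for c in combinations(ys, k))  # X_n^(k) on this outcome

# E[X_n^(k)] = C(n, k) * (E[Y])^k
assert EX == comb(n, k) * mu**k
print(EX)  # -> 6
```

With a mean-zero choice for $Y$ the same enumeration returns $0$ for every $n$, consistent with the martingale property the OP wants to prove.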