Expectation of $X^2$ where $X = x_1 + x_2 + x_3 + \dots + x_k$ with $P(x_i = 0) = 1-p$ and $P(x_i = x) = p$


We are told that $X = x_1 + x_2 + x_3 + \dots + x_k$ and that the $x$'s are independent. The probability that $x_i = x$ is $p$, while the probability that $x_i = 0$ is $1-p$. Use this to find $E(X^2)$.

The first thing I noticed is that $P(X = x)$ is the probability that exactly one of the $x_i$ equals $x$ (and the rest are $0$), which makes me think $X$ follows a binomial-type distribution. If that is the case, $E(X)$ is just the sum of the individual expectations. However, I am not sure how to handle $E(X^2) = E((x_1 + x_2 + x_3 + \dots + x_k)^2)$.
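Not part of the question, but a quick way to sanity-check whatever closed form one derives is a simulation. This sketch takes $x = 1$ for concreteness (my choice for illustration, not stated in the problem) and estimates $E(X^2)$ by Monte Carlo:

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

# Monte Carlo estimate of E[X^2] for X = x_1 + ... + x_k,
# taking x = 1 so that each x_i is Bernoulli(p).
k, p, trials = 10, 0.3, 200_000
total = 0.0
for _ in range(trials):
    X = sum(1 for _ in range(k) if random.random() < p)
    total += X ** 2
print(total / trials)
```

For these values ($k = 10$, $p = 0.3$) the estimate should land near $11.1$, the exact second moment of a Binomial$(10, 0.3)$ variable.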

Any suggestion?



Best answer:

It is possible to perform the calculation directly.

First, note that this answer takes $x = 1$, so that each $x_i$ is a Bernoulli$(p)$ variable. In that case $x_i^2 = x_i$, because if $x_i = 1$, then $x_i^2 = 1$; and if $x_i = 0$, then $x_i^2 = 0$. Then

$$\begin{align} \operatorname{E}[X^2] &= \operatorname{E}\,\left[\left(\sum_{i=1}^k x_i \right)^2\right] \\ &= \operatorname{E}\left[ \sum_{i=1}^k \sum_{j=1}^k x_i x_j \right] \\ &= \operatorname{E}\left[ \sum_{i=1}^k x_i^2 + \sum_{i \ne j} x_i x_j \right] \\ &= \sum_{i=1}^k \operatorname{E}[x_i^2] + \sum_{i \ne j} \operatorname{E}[x_i x_j] \\ &= \sum_{i=1}^k \operatorname{E}[x_i] + \sum_{i \ne j} \operatorname{E}[x_i] \operatorname{E}[x_j] \\ &= kp + k(k-1)p^2 \\ &= kp(kp - p + 1). \end{align}$$

The first step is to write $X$ as the sum of the $x_i$. The second simply expands the square by writing it as a double sum. The third step separates the terms with $i = j$, of which there are $k$, from the terms with $i \ne j$, of which there are $k(k-1)$. The fourth step is the linearity of expectation. The fifth step uses the fact $x_i^2 = x_i$ explained earlier, together with the pairwise independence of the $x_i$ to factor $\operatorname{E}[x_i x_j]$. The rest is algebra.
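One way to check the closed form numerically (a sketch, assuming $x = 1$ as in this answer): since $X$ is a sum of $k$ iid Bernoulli$(p)$ variables, $X \sim \text{Binomial}(k, p)$, so $E[X^2]$ can be computed directly from the binomial pmf and compared with $kp(kp - p + 1)$.

```python
from math import comb

def e_x2_exact(k, p):
    # E[X^2] computed directly from the Binomial(k, p) pmf,
    # since X is a sum of k iid Bernoulli(p) variables.
    return sum(n**2 * comb(k, n) * p**n * (1 - p)**(k - n)
               for n in range(k + 1))

k, p = 10, 0.3
closed_form = k * p * (k * p - p + 1)  # kp(kp - p + 1) from the derivation
print(abs(e_x2_exact(k, p) - closed_form) < 1e-9)  # True
```

Here both sides evaluate to $kp(1-p) + (kp)^2 = 11.1$ for $k = 10$, $p = 0.3$.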

Another answer:

Here's my understanding of what you've asked, using slightly different notation to make things a bit clearer. Let $X_1, ..., X_k \sim \text{iid}$ with $P(X_i = 0) = 1-p$ and $P(X_i = c) = p$ for some constant $c$. We are asked to calculate $E[(X_1 + ... + X_k)^2]$.

This problem becomes simpler if we define $Z_1, ..., Z_k \sim \text{iid Bernoulli}(p)$. Note that $X_i$ is equal in distribution to $cZ_i$. Thus, we have: $$E[(X_1 + ... + X_k)^2] = E[(cZ_1 + ... + cZ_k)^2] = c^2 E[(Z_1 + ... + Z_k)^2].$$ Now, since the $Z_i$ are iid Bernoulli$(p)$, it follows that $Z_1 + ... + Z_k \sim \text{Binomial}(k, p)$. The variance of a Binomial$(k,p)$ is $kp(1-p)$ while the mean is $kp$. Thus: $$E[(Z_1 + ... + Z_k)^2]= Var[(Z_1 + ... + Z_k)] + (E[Z_1 + ... + Z_k])^2 = kp(1-p) + (kp)^2$$ and it follows that $$E[(X_1 + ... + X_k)^2] = c^2[kp(1-p) + (kp)^2]$$
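A small numeric check of this result (a sketch, with arbitrary example values for $c$, $k$, $p$): compute $E[(X_1 + \dots + X_k)^2]$ by summing $(cn)^2$ against the Binomial$(k, p)$ pmf and compare with $c^2[kp(1-p) + (kp)^2]$.

```python
from math import comb

def e_X2(c, k, p):
    # E[(X_1 + ... + X_k)^2] where X_i = c * Z_i and
    # Z_1 + ... + Z_k ~ Binomial(k, p): sum (c*n)^2 over the pmf.
    return sum((c * n)**2 * comb(k, n) * p**n * (1 - p)**(k - n)
               for n in range(k + 1))

c, k, p = 2.5, 8, 0.4
formula = c**2 * (k * p * (1 - p) + (k * p)**2)
print(abs(e_X2(c, k, p) - formula) < 1e-9)  # True
```

Setting $c = 1$ recovers the $kp(1-p) + (kp)^2 = kp(kp - p + 1)$ answer above.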