Expectation and variance of a sum of independent random variables


Suppose I have $n$ independent random variables $X_1,\dots,X_n$, and let $S=p_1X_1+p_2X_2+p_3X_3+\cdots+p_nX_n$, where $\sum_{i=1}^np_i=1$. I want to find the expectation and variance of $S$, given $\mu_i=E(X_i)$ and $\sigma_i^2=V(X_i)$. My attempt:

$E(S)=\sum_{i=1}^np_i\mu_i$ and $V(S)=\sum_{i=1}^np_iE(X_i^2)-\sum_{i=1}^np_iE(X_i)^2=\sum_{i=1}^np_i\sigma_i^2$.

There is no need to use the law of total expectation/variance. Am I on the right track? Thank you!


2 Answers

Accepted answer (by Annika):

You're close.

For linear combinations of independent random variables $X_i$ with means $\mu_i$ and variances $\sigma^2_i$ we do the following:

If $$S_n = \sum_{i=1}^n p_iX_i$$

then

$$E[S_n]= \sum_{i=1}^n p_i\mu_i,\qquad V[S_n] = \sum_{i=1}^n p_i^2\sigma_i^2.$$
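Both formulas can be sanity-checked exactly on small finite distributions by enumerating the joint distribution, since independence makes the joint probability a product of marginals. A minimal sketch (the distributions and weights below are made up for illustration):

```python
import itertools

# Hypothetical example: three independent discrete random variables,
# each given as a list of (value, probability) pairs.
dists = [
    [(0, 0.5), (1, 0.5)],             # X1: fair coin
    [(1, 0.25), (2, 0.75)],           # X2
    [(-1, 0.4), (0, 0.2), (3, 0.4)],  # X3
]
p = [0.2, 0.3, 0.5]  # weights summing to 1

def mean(dist):
    return sum(v * q for v, q in dist)

def var(dist):
    m = mean(dist)
    return sum((v - m) ** 2 * q for v, q in dist)

# Exact E[S] and V[S] for S = sum_i p_i X_i, by enumerating the joint
# distribution (independence => joint probability is the product).
ES = 0.0
ES2 = 0.0
for combo in itertools.product(*dists):
    prob = 1.0
    s = 0.0
    for (v, q), w in zip(combo, p):
        prob *= q
        s += w * v
    ES += prob * s
    ES2 += prob * s ** 2
VS = ES2 - ES ** 2

# Compare against the closed-form expressions from the answer.
ES_formula = sum(w * mean(d) for w, d in zip(p, dists))
VS_formula = sum(w ** 2 * var(d) for w, d in zip(p, dists))
print(abs(ES - ES_formula) < 1e-12)  # True
print(abs(VS - VS_formula) < 1e-12)  # True
```

Note that the exact variance agrees with $\sum_i p_i^2\sigma_i^2$, not with the question's $\sum_i p_i\sigma_i^2$.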

Second answer:

Since it seems that you have some doubts about the underlying concepts, let me add some details to Annika's answer.

Let $S_n = \sum_{i=1}^n p_i X_i$. Then, since expectation is a linear operator, $$ E[S_n] = E\biggl[\sum_{i=1}^n p_i X_i\biggr]=\sum_{i=1}^n p_i E\bigl[X_i\bigr]=\sum_{i=1}^n p_i \mu_i. $$ Note that this is always true: it does not matter whether the $X_i$ are independent or not.
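The "always true" part can be illustrated with a deliberately dependent pair, say $X_2 = X_1$: the expectation formula still holds exactly, while the independence-based variance formula does not. A small sketch with toy numbers assumed for illustration:

```python
# Fully dependent pair: X2 = X1, so S = 0.5*X1 + 0.5*X2 = X1 exactly.
p = [0.5, 0.5]
x1 = [(0, 0.5), (4, 0.5)]  # X1 takes 0 or 4 with equal probability

mu = sum(v * q for v, q in x1)                  # E[X1] = E[X2] = 2.0
sigma2 = sum((v - mu) ** 2 * q for v, q in x1)  # V[X1] = V[X2] = 4.0

# Since S = X1, its exact mean and variance are those of X1.
ES = mu       # 2.0, and p1*mu1 + p2*mu2 = 2.0: linearity holds regardless
VS = sigma2   # 4.0, but p1^2*s1^2 + p2^2*s2^2 = 0.25*4 + 0.25*4 = 2.0
print(ES == sum(pi * mu for pi in p))        # True
print(VS == sum(pi**2 * sigma2 for pi in p)) # False: variance needs independence
```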

Now, as for the variance: \begin{align*} V[S_n] &= E\Bigl[\bigl(S_n - E[S_n]\bigr)^2\Bigr] \\ &= E\Bigl[\bigl(\sum_{i=1}^n p_i X_i - \sum_{i=1}^n p_i \mu_i\bigr)^2\Bigr] &&\text{[by our previous result]}\\ &= E\Bigl[\bigl(\sum_{i=1}^n p_i (X_i - \mu_i)\bigr)^2\Bigr] \\ &= E\Bigl[\bigl(\sum_{i=1}^n p_i (X_i - \mu_i)\bigr)\bigl(\sum_{j=1}^n p_j (X_j - \mu_j)\bigr)\Bigr] \\ &= E\Bigl[\sum_{i=1}^n\sum_{j=1}^n p_i p_j (X_i - \mu_i)(X_j - \mu_j)\Bigr] \\ &= \sum_{i=1}^n\sum_{j=1}^n p_i p_j E\Bigl[(X_i - \mu_i)(X_j - \mu_j)\Bigr] &&\text{[by the linearity of $E$]}. \end{align*}

At this point we have two different cases. If $i=j$, then $$ E\Bigl[(X_i - \mu_i)(X_i - \mu_i)\Bigr] = E\Bigl[(X_i - \mu_i)^2\Bigr]=V(X_i)=\sigma_i^2. $$ In the second case, when $i\ne j$, we finally take advantage of the fact that the $X_i$ are independent of one another; indeed, this allows us to write the expectation of the product as the product of the expectations: \begin{multline} E\Bigl[(X_i - \mu_i)(X_j - \mu_j)\Bigr] = E[X_i - \mu_i]E[X_j - \mu_j] \\ = \bigl(E[X_i] - \mu_i\bigr)\bigl(E[X_j] - \mu_j\bigr) = (\mu_i - \mu_i)(\mu_j - \mu_j)=0. \end{multline}
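That the cross terms vanish can also be checked exactly on a toy pair of independent discrete variables (values and probabilities below are made up for illustration):

```python
# Two independent discrete variables as (value, probability) pairs.
X = [(0, 0.3), (2, 0.7)]
Y = [(-1, 0.6), (5, 0.4)]

mx = sum(v * q for v, q in X)  # E[X]
my = sum(v * q for v, q in Y)  # E[Y]

# E[(X - mx)(Y - my)] over the product joint distribution:
cov = sum(qx * qy * (vx - mx) * (vy - my)
          for vx, qx in X for vy, qy in Y)
print(abs(cov) < 1e-12)  # True: the i != j terms contribute nothing to V[S_n]
```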

Wrapping up: \begin{align*} V[S_n] &= \sum_{i=1}^n\sum_{j=1}^n p_i p_j E\Bigl[(X_i - \mu_i)(X_j - \mu_j)\Bigr] \\ &= \sum_{i=1}^n p_i^2 E\Bigl[(X_i - \mu_i)^2\Bigr] + \sum_{i=1}^n\sum_{\substack{j=1 \\ j\ne i}}^n p_i p_j E\Bigl[(X_i - \mu_i)(X_j - \mu_j)\Bigr] \\ &= \sum_{i=1}^n p_i^2\sigma_i^2 + \sum_{i=1}^n\sum_{\substack{j=1 \\ j\ne i}}^n p_i p_j \cdot 0 \\ &= \sum_{i=1}^n p_i^2\sigma_i^2. \end{align*}