Suppose I have $n$ independent random variables $X_1,\dots,X_n$ and $S=p_1X_1+p_2X_2+p_3X_3+\cdots+p_nX_n$, where $\sum_{i=1}^np_i=1$. I want to find the expectation and variance of $S$, given $\mu_i=E(X_i)$ and $\sigma_i^2=V(X_i)$. My attempt:
$E(S)=\sum_{i=1}^np_i\mu_i$ and $V(S)=\sum_{i=1}^np_iE(X_i^2)-\sum_{i=1}^np_iE(X_i)^2=\sum_{i=1}^np_i\sigma_i^2$.
There is no need to use the law of total expectation/variance. Am I on the right track? Thank you!
You're close.
For a linear combination of independent random variables $X_i$ with means $\mu_i$ and variances $\sigma_i^2$, the rules are as follows:
If $$S_n = \sum_{i=1}^n p_iX_i$$
then
$$E[S_n]= \sum_{i=1}^n p_i\mu_i,\qquad V[S_n] = \sum_{i=1}^n p_i^2\sigma_i^2.$$
The expectation follows from linearity, so your formula for $E[S_n]$ is correct. For the variance, however, $V[p_iX_i]=p_i^2V[X_i]$, and independence makes all the covariance terms vanish, so the weights enter *squared*. Your attempt treats $S$ as if it were a mixture distribution (where $V$ would indeed involve $p_i\sigma_i^2$), rather than a weighted sum.
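As a sanity check, a quick Monte Carlo simulation confirms both formulas. The weights, means, and standard deviations below are illustrative assumptions (any values with $\sum p_i = 1$ would do), and the $X_i$ are taken to be normal purely for convenience:

```python
import numpy as np

# Illustrative parameters (assumed, not from the question)
p = np.array([0.5, 0.3, 0.2])      # weights p_i, summing to 1
mu = np.array([1.0, 2.0, -1.0])    # means mu_i
sigma = np.array([0.5, 1.0, 2.0])  # standard deviations sigma_i

# Closed-form values for S = sum_i p_i X_i
E_S = np.sum(p * mu)               # E[S] = sum p_i mu_i
V_S = np.sum(p**2 * sigma**2)      # V[S] = sum p_i^2 sigma_i^2

# Monte Carlo estimate: draw many independent samples of (X_1, X_2, X_3)
rng = np.random.default_rng(0)
X = rng.normal(mu, sigma, size=(1_000_000, 3))
S = X @ p

print(f"E[S]: formula {E_S:.4f}, simulated {S.mean():.4f}")
print(f"V[S]: formula {V_S:.4f}, simulated {S.var():.4f}")
```

The simulated mean and variance land within Monte Carlo error of the formulas; using $\sum p_i\sigma_i^2$ instead would give $0.5\cdot0.25+0.3\cdot1+0.2\cdot4=1.225$ here, far from the simulated variance of about $0.3125$.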