I have a distribution that can be defined as below:
$S=a_0\cdot b_0 + a_1\cdot b_1 + a_2\cdot b_2 + \cdots +a_{n-1}\cdot b_{n-1}$
Now, I want to find the distribution of $S$ when the $a_i$ are drawn from a given distribution with standard deviation $\sigma$ (for simplicity we can assume it is Gaussian), and each $b_i$ is $+5$ with probability $p$ and $-5$ with probability $1-p$. Now, as far as I know, $S$ will be normally distributed too, but what will the standard deviation of this distribution be?
$S$ will not be normally distributed in general (each term $a_i b_i$ is a two-component Gaussian mixture unless the $a_i$ have mean zero), but no matter.
If everything is independent, then the variances add:
$$\operatorname{Var}(S)=\sum_{i=0}^{n-1}\operatorname{Var}(a_i b_i).$$
Writing $\mu=E[a_i]$ and using $b_i^2=25$ and $E[b_i]=5(2p-1)$,
$$\operatorname{Var}(a_i b_i)=E[a_i^2 b_i^2]-\big(E[a_i]E[b_i]\big)^2=25(\sigma^2+\mu^2)-25\mu^2(2p-1)^2=25\sigma^2+100\,p(1-p)\,\mu^2,$$
since $1-(2p-1)^2=4p(1-p)$. Hence
$$\operatorname{Var}(S)=n\big(25\sigma^2+100\,p(1-p)\,\mu^2\big),$$
so the standard deviation is $5\sqrt{n}\,\sqrt{\sigma^2+4p(1-p)\mu^2}$, which reduces to $5\sigma\sqrt{n}$ when the $a_i$ have mean zero.
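As a sanity check, here is a minimal Monte Carlo sketch (assuming NumPy; the concrete values of $n$, $\mu$, $\sigma$, and $p$ are arbitrary choices for illustration, not from the question) comparing the empirical standard deviation of $S$ with the formula above:

```python
import numpy as np

# Arbitrary illustrative parameters (not from the question)
n, mu, sigma, p = 50, 1.0, 2.0, 0.3
trials = 200_000

rng = np.random.default_rng(0)
a = rng.normal(mu, sigma, size=(trials, n))           # a_i ~ N(mu, sigma^2)
b = np.where(rng.random((trials, n)) < p, 5.0, -5.0)  # b_i = +5 w.p. p, else -5
S = (a * b).sum(axis=1)                               # S = sum_i a_i * b_i

empirical_sd = S.std()
predicted_sd = 5 * np.sqrt(n * (sigma**2 + 4 * p * (1 - p) * mu**2))
print(empirical_sd, predicted_sd)  # the two values should agree closely
```

With $\mu \neq 0$ you can also plot a histogram of $S$ to see that it is close to, but not exactly, Gaussian.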