Let $X_i$ denote the numerical value of the $i$-th event, and assume the $X_i$ are i.i.d.
Say we want to compute
$$ c = E\left[\sum_{i=1}^NX_i\right] $$
$N$ is a random variable; you can think of it as representing the number of events needed to reach some constant sum $c$. My question is: can the above be simplified in the following manner?
$$ c = \sum_{i=1}^NE[X_i] \\ = NE[X_i] $$
Since the $X_i$ are i.i.d., say they have mean $\mathbb E[X_i]=\mu$. For a fixed $n$ (not random) we get $\mathbb E\left[\sum_{i=1}^nX_i\right]=n\mu$ by linearity of expectation. Now suppose $N$ is random with some well-defined distribution on the positive integers, and assume additionally that $N$ is independent of the $X_i$ (so that conditioning on $N=n$ does not change the distribution of the $X_i$).
Under these assumptions, $$ \mathbb E\left[\sum_{i=1}^NX_i\right] = \sum_{n=1}^\infty\mathbb E\underbrace{\left[\sum_{i=1}^nX_i\right]}_{=n\mu}\operatorname{Pr}(N=n) = \sum_{n=1}^\infty n\operatorname{Pr}(N=n)\cdot\mu = \mathbb E[N]\cdot\mu $$ In the first step I used the law of total expectation, conditioning on $N=n$ (independence of $N$ and the $X_i$ is what lets the inner expectation reduce to $n\mu$), and in the last step the definition of expectation, $\mathbb E[N] = \sum_{n}n\operatorname{Pr}(N=n)$. So your proposed simplification is almost right, except that $N$ must be replaced by $\mathbb E[N]$: the result $\mathbb E\left[\sum_{i=1}^NX_i\right]=\mathbb E[N]\,\mu$ is known as Wald's identity.
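If you want to convince yourself numerically, here is a minimal Monte Carlo sketch of the identity. The particular distributions are my own choices for illustration: $X_i \sim \mathrm{Uniform}(0,2)$ (so $\mu=1$) and $N \sim \mathrm{Geometric}(p=0.25)$ on $\{1,2,\dots\}$ (so $\mathbb E[N]=4$), with $N$ drawn independently of the $X_i$.

```python
import random

random.seed(0)

P = 0.25  # success probability for N (assumed Geometric on {1, 2, ...})

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

def random_sum():
    """One sample of sum_{i=1}^N X_i, with N independent of the X_i."""
    n = geometric(P)
    return sum(random.uniform(0.0, 2.0) for _ in range(n))

trials = 200_000
empirical = sum(random_sum() for _ in range(trials)) / trials
theoretical = (1 / P) * 1.0  # E[N] * mu = 4 * 1

print(f"empirical  : {empirical:.3f}")
print(f"theoretical: {theoretical:.3f}")
```

With this many trials the empirical mean lands close to the theoretical value $\mathbb E[N]\,\mu = 4$; replacing $N$ by a fixed constant inside the expectation, as in the proposed simplification, would instead give whatever value of $n$ you plugged in.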