Let's say I have two independent variables, $X\sim N(10,9)$ and $Y\sim N(5,4)$. $X$ represents the number of orders received in a month, and $Y$ represents the size of each order. For this example, a store on average receives $10$ orders/month with an average order size of $5$ widgets/order. The average number of widgets sold per month is $50$ ($10 \times 5$), but I want to know how to calculate the variance of widgets sold per month, which is not equal to the variance of the product $XY$ ($661$). Using a simulation of $2000$ months, I came up with an empirical variance of $\sim 265$. Please tell me how I can calculate this figure formulaically. Thank you!
Edit: Just to clarify, the number of widgets sold per month is a third random variable, $Z\sim N(50,?)$. I want to be able to calculate the variance of $Z$ from what I know of $X$ and $Y$.
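For reference, a minimal sketch of such a simulation (this assumes the monthly order count is rounded to a nonnegative integer, since the normal model technically allows negative or fractional counts; the rounding inflates the variance slightly):

```python
import numpy as np

rng = np.random.default_rng(0)

n_months = 100_000  # more months than the original 2000, for a tighter estimate

# Number of orders per month: X ~ N(10, 9), rounded to a nonnegative integer
# (an assumption -- a normal draw can be negative or fractional).
orders = np.maximum(rng.normal(10, 3, n_months).round().astype(int), 0)

# Widgets per month: the sum of `orders` iid draws of Y ~ N(5, 4).
widgets = np.array([rng.normal(5, 2, n).sum() for n in orders])

print(widgets.mean())  # close to 50
print(widgets.var())   # close to 265
```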
Just as there is a law of total expectation, there is also a law of total variance. We will call the random variable representing the total number of widgets ordered in a month $W$. Note that $$W = Y_1 + Y_2 + \cdots + Y_X,$$ because if we observe $X$ orders in a month, that means we observed orders $Y_1, Y_2, \ldots, Y_X$ of widgets each.
So, the expected value of $W$ is $$\operatorname{E}[W] = \operatorname{E}[\operatorname{E}[W \mid X]] = \operatorname{E}[X \mu_Y] = \mu_X \mu_Y,$$ as you would expect. But the variance calculation uses the formula
$$\operatorname{Var}[W] = \operatorname{Var}[\operatorname{E}[W \mid X]] + \operatorname{E}[\operatorname{Var}[W \mid X]].$$ Using this, we obtain $$\begin{align*} \operatorname{Var}[W] &= \operatorname{Var}[X \mu_Y] + \operatorname{E}[X \sigma_Y^2] \\ &= \sigma_X^2 \mu_Y^2 + \mu_X \sigma_Y^2. \end{align*}$$ This is because, for example, $\operatorname{Var}[W \mid X]$ is the conditional variance of $W$ given $X$; that is to say, we regard $X$ as some fixed value with respect to the variance, and since there are $X$ such iid $Y_i$s in $W$, this conditional variance is simply $X \sigma_Y^2$ (using the formula $\operatorname{Var}[Y_1 + \cdots + Y_n] = n \operatorname{Var}[Y]$). As a check, substituting the given values in your example results in $\operatorname{Var}[W] = 9(5^2) + 10(4) = 265,$ as you found empirically.
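The two terms of the law of total variance can also be verified numerically, by integrating the conditional mean $X\mu_Y$ and conditional variance $X\sigma_Y^2$ against the density of $X$ (a sketch using `scipy`; the integration range is truncated to an interval that carries essentially all of the mass of $f_X$):

```python
from scipy.integrate import quad
from scipy.stats import norm

mu_X, sd_X = 10, 3   # X ~ N(10, 9)
mu_Y, var_Y = 5, 4   # Y ~ N(5, 4)

f_X = lambda x: norm.pdf(x, mu_X, sd_X)

# E[W] = E[E[W | X]] = E[X mu_Y] = mu_X mu_Y
E_W = quad(lambda x: x * mu_Y * f_X(x), -40, 60)[0]

# Var[E[W | X]] = Var[X mu_Y] = sigma_X^2 mu_Y^2
var_of_mean = quad(lambda x: (x * mu_Y - E_W) ** 2 * f_X(x), -40, 60)[0]

# E[Var[W | X]] = E[X sigma_Y^2] = mu_X sigma_Y^2
mean_of_var = quad(lambda x: x * var_Y * f_X(x), -40, 60)[0]

print(E_W, var_of_mean, mean_of_var)  # 50, 225, 40 -- so Var[W] = 265
```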
Note that it is not obvious that $W$ is itself normal. The law of total expectation and the law of total variance apply even when the distributions are not normal, and they say nothing about the normality of $W$. If $X$ were fixed, then the conditional distribution $W \mid X$ is indeed normal, as it is the sum of a fixed number of iid random variables $Y_1, \ldots, Y_X$. But if $X$ is random, why should the unconditional distribution of $W$ also be normal? This is not obvious at first glance, and you should be careful in making such a claim.
To see if $W$ is indeed normal, it suffices to integrate the conditional distribution $W \mid X$ over $X$: that is to say, $$f_W(w) = \int_{x=-\infty}^\infty f_{W \mid X}(w \mid x) f_X(x) \, dx.$$ We know that $$W \mid X \sim \operatorname{Normal}(X\mu_Y, X\sigma_Y^2),$$ with density $$f_{W \mid X}(w \mid x) = \frac{1}{\sqrt{2\pi x}\,\sigma_Y} e^{-(w - x\mu_Y)^2/(2 x \sigma_Y^2)}.$$ But here we have now exposed a fundamental problem: because $X$ is normal, it has support on $(-\infty, \infty)$, thus the conditional variance $X \sigma_Y^2$ is nonsensical (negative) when $X < 0$! This is the giveaway that tells us that, in general, $W$ cannot be normal, even ignoring how $W$ as we have defined it is the "sum" of a possibly non-integer number of $Y_i$s (since $X$ is a continuous random variable). Nevertheless, we can still calculate the moments of $W$ even if we don't have an explicit distribution for it, as we have shown above.
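One way to see the non-normality concretely: sample $W$ from the conditional distribution $W \mid X \sim \operatorname{Normal}(X\mu_Y, X\sigma_Y^2)$, discarding the tiny mass where $X \le 0$ (an assumption needed to keep the conditional variance positive), and compute the sample skewness. A normal distribution has skewness $0$, but $W$ comes out noticeably right-skewed:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_X, sd_X, mu_Y, sd_Y = 10, 3, 5, 2

n = 1_000_000
x = rng.normal(mu_X, sd_X, n)
x = x[x > 0]  # drop the ~0.04% of draws with X <= 0, where the model breaks down

# Draw W from the mixture: W | X = x ~ Normal(x * mu_Y, x * sd_Y^2)
w = rng.normal(x * mu_Y, np.sqrt(x) * sd_Y)

skew = np.mean((w - w.mean()) ** 3) / w.std() ** 3
print(round(skew, 2))  # clearly positive (around 0.12), so W is not normal
```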