For a concrete example: suppose you have a hypergeometric setup, i.e. a set of $N$ objects partitioned into nonempty subsets X and Y, from which we draw without replacement. However, rather than simply drawing $r$ objects, we toss a fair coin $r$ times; on each heads we make one random draw, and on tails we do nothing. We then ask for the expected number of elements of X drawn after the $r$ tosses; call this $Z$. Write italic $X$ for the number of elements in X, and $N$ for the total number of objects.
I just used the law of total expectation to derive what I hope is the correct answer:
$$\mathbb{E}Z = \frac{rX}{2N}$$
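The computation, in brief: condition on the number of heads $H \sim \operatorname{Bin}(r, \tfrac{1}{2})$, and use the fact that given $H = h$, the number of X's drawn is hypergeometric with $h$ draws and hence has mean $hX/N$:

$$\mathbb{E}Z = \mathbb{E}\big[\mathbb{E}[Z \mid H]\big] = \mathbb{E}\left[\frac{HX}{N}\right] = \frac{X}{N}\,\mathbb{E}H = \frac{X}{N}\cdot\frac{r}{2} = \frac{rX}{2N}.$$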
But it also occurs to me that there is intuitive appeal to reasoning as follows: on average there will be $r/2$ heads, so the answer might simply equal the mean of a hypergeometric distribution with that many draws. And this way of thinking gives the same answer!
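As a sanity check, here is a quick Monte Carlo sketch of the scheme (the parameter values at the bottom are just illustrative choices) comparing the simulated mean against $rX/(2N)$:

```python
import random

def simulate(N, X, r, trials=100_000):
    """Monte Carlo estimate of E[Z] for the coin-toss scheme.

    N objects, X of them of type X.  For each of r fair coin tosses,
    a heads triggers one draw without replacement; a tails does nothing.
    Returns the average number of type-X objects drawn over all trials.
    """
    total = 0
    for _ in range(trials):
        pool = [1] * X + [0] * (N - X)  # 1 = element of X, 0 = element of Y
        random.shuffle(pool)            # shuffled pool simulates random draws
        drawn = 0
        z = 0
        for _ in range(r):
            # heads with probability 1/2; guard against exhausting the pool
            if random.random() < 0.5 and drawn < N:
                z += pool[drawn]
                drawn += 1
        total += z
    return total / trials

# Illustrative parameters: the closed form gives r*X/(2*N) = 6*4/20 = 1.2
N, X, r = 10, 4, 6
print(simulate(N, X, r), r * X / (2 * N))
```

The simulated mean lands close to the closed-form value, which is at least consistent with the derivation.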
What I want to know is this: is there a more general principle by which, when computing an expectation, you can replace components of the system with their expectations? Linearity of expectation accomplishes this in many useful cases. Are there other, more general settings in which this substitution is valid, and which would account for its success in the example above?