Generation of random variables given that the PDF is a linear combination


I am currently trying to generate samples of a random variable $\mathcal X$ with PDF $f_{\mathcal X}(x)$. Suppose that it is comparatively complicated to sample $\mathcal X$ directly, but it is given that $$f_{\mathcal X}(x) = \sum_{i = 1}^{M} \mu_i f_{\mathcal Y_i}(x),$$ where $\mu_i \in \mathbb R^+$, $\sum_i \mu_i = 1$, and $f_{\mathcal Y_i}(x)$ denotes the PDF of a random variable $\mathcal Y_i$ from which it is relatively straightforward to generate samples.

How can I generate samples of $\mathcal X$, given that I can already generate samples of each $\mathcal Y_i$?

I have tried taking a linear combination of the samples themselves, but that does not work: scaling and adding random variables produces a convolution of their PDFs, not the weighted sum given here.

Any hints?

Best answer:

In order for the question to make sense, it must hold that $$ \mu_1 + \cdots + \mu_M = 1. $$ This is called a convex combination, in contrast to a general linear combination.

Anyway, generating a sample of $X$ is simple: first pick an index $i \in \{1,\ldots,M\}$ with probability $\mu_i$; then, given that $i$ was chosen, draw a sample from $Y_i$ and use it as the sample of $X$.
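This two-step scheme can be sketched in a few lines of NumPy. The weights and the two normal components below are made-up illustration values, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture: f_X = 0.3 * N(-2, 1^2) + 0.7 * N(3, 0.5^2).
weights = np.array([0.3, 0.7])           # the mu_i; must sum to 1
samplers = [
    lambda n: rng.normal(-2.0, 1.0, n),  # draws from Y_1
    lambda n: rng.normal(3.0, 0.5, n),   # draws from Y_2
]

def sample_mixture(n):
    """Draw n samples of X: pick index i with prob. mu_i, then sample Y_i."""
    idx = rng.choice(len(weights), size=n, p=weights)  # step 1: pick i
    out = np.empty(n)
    for i, draw in enumerate(samplers):
        mask = idx == i
        out[mask] = draw(mask.sum())                   # step 2: sample Y_i
    return out

x = sample_mixture(100_000)
```

As a sanity check, the sample mean should be close to $\sum_i \mu_i \, E[Y_i] = 0.3\cdot(-2) + 0.7\cdot 3 = 1.5$ for this choice of components.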

You can see why this works by looking at the law of total probability, which states that $$ P\left(B\right) = \sum_i P\left(B\mid A_i\right) P\left(A_i\right) $$ whenever $A_1,A_2,\ldots$ are disjoint events with positive probability such that $$ P\left(A_1\right) + P\left(A_2\right) + \cdots = 1.$$
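To make the connection explicit, let $A_i$ be the event that index $i$ is chosen in the first step, so $P(A_i) = \mu_i$, and take $B = \{X \le x\}$. Conditional on $A_i$, the sample is drawn from $Y_i$, so $P(X \le x \mid A_i) = F_{Y_i}(x)$, and the law of total probability gives $$ F_X(x) = \sum_{i=1}^{M} \mu_i F_{Y_i}(x). $$ Differentiating both sides in $x$ recovers exactly the given density $f_X(x) = \sum_i \mu_i f_{Y_i}(x)$.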