I'm wondering how, if it is at all possible, to write the p.d.f. for the following random variable.
Given RVs $X_1$ and $X_2$ distributed according to some joint distribution with known density $p(x_1, x_2)$, define the random variable $Y$ by $Y = X_1$ w.p. $1/2$ and $Y = X_2$ w.p. $1/2$.
What is the p.d.f. of $Y$? Is it
$$ p(y) = \frac{1}{2}\left(\int p(y,x) dx + \int p(x,y) dx\right)? $$
More generally, if we have RVs $X_1,\dots,X_n \sim p(\cdot, \dots, \cdot)$, with $Y$ chosen uniformly at random from $X_1,\dots,X_n$, do we have $$ p(y) = \frac{1}{n}\sum_{i=1}^n \int p(x_1,\dots,x_{i-1}, y, x_{i+1},\dots,x_n)\, dx_1 \cdots dx_{i-1}\, dx_{i+1} \cdots dx_n? $$
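A quick simulation seems to support this conjecture. Here is a sketch for $n=3$, assuming (purely for illustration) a concrete trivariate normal joint with unit-variance marginals $X_i \sim N(\mu_i, 1)$; it compares a histogram of $Y$ against the average of the three marginal densities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative joint (an assumption, not from the question): trivariate
# normal with marginals X_i ~ N(mu_i, 1) and pairwise correlation 0.3.
mu = np.array([0.0, 3.0, 6.0])
cov = np.full((3, 3), 0.3) + 0.7 * np.eye(3)
n_samples = 150_000
x = rng.multivariate_normal(mu, cov, size=n_samples)

# Choose the index I uniformly at random, independently of (X_1, X_2, X_3),
# and set Y = X_I.
idx = rng.integers(0, 3, size=n_samples)
y = x[np.arange(n_samples), idx]

# Conjectured p.d.f.: the average of the n marginal densities.
def phi(t, m):
    return np.exp(-0.5 * (t - m) ** 2) / np.sqrt(2 * np.pi)

g = lambda t: np.mean([phi(t, m) for m in mu], axis=0)

# Compare an empirical histogram of Y with g on a grid.
edges = np.linspace(-4.0, 10.0, 57)
hist, _ = np.histogram(y, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = float(np.max(np.abs(hist - g(centers))))

# E[Y] should be close to the average of the means, (0 + 3 + 6)/3 = 3.
print(float(y.mean()), max_err)
```

With 150,000 samples the histogram agrees with $g$ to within a few thousandths everywhere, which is consistent with the formula (though of course not a proof).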
In the first case, one can formalize the situation by writing $Y=BX_1+(1-B)X_2$, where $B$ is independent of $(X_1,X_2)$ and $P(B=0)=P(B=1)=\frac12$. Then, for every bounded measurable function $u$ (writing $E(u(Y);B=b)$ for $E(u(Y)\mathbf{1}_{B=b})$), $$E(u(Y))=E(u(Y);B=0)+E(u(Y);B=1)=E(u(X_2);B=0)+E(u(X_1);B=1). $$

By independence, the RHS is $$ E(u(X_2))P(B=0)+E(u(X_1))P(B=1)=\frac12(E(u(X_1))+E(u(X_2)))=\int u(y)g(y)\mathrm dy,$$ where $g=\frac12(f_1+f_2)$, assuming that $X_i$ has density $f_i$. Since this holds for every bounded measurable function $u$, $g$ is the PDF of $Y$.

Now identify each $f_i$ as a marginal of $p$: this is exactly the formula conjectured in the question, and the same argument with a uniform index $I$ in place of $B$ gives the general $n$-fold version.
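This conclusion is easy to check numerically. A minimal sketch for the $n=2$ case, assuming (as an illustrative choice, not part of the question) a bivariate normal joint with marginals $f_1 = N(0,1)$ and $f_2 = N(3,4)$: sample $(X_1,X_2)$, flip an independent fair coin $B$, and compare a histogram of $Y=BX_1+(1-B)X_2$ with $g=\frac12(f_1+f_2)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative joint: bivariate normal with X1 ~ N(0, 1), X2 ~ N(3, 4),
# and Cov(X1, X2) = 1 (so Corr = 0.5).
mean = np.array([0.0, 3.0])
cov = np.array([[1.0, 1.0],
                [1.0, 4.0]])
n = 200_000
x = rng.multivariate_normal(mean, cov, size=n)

# B ~ Bernoulli(1/2), independent of (X1, X2); Y = B*X1 + (1-B)*X2.
b = rng.integers(0, 2, size=n)
y = np.where(b == 1, x[:, 0], x[:, 1])

def normal_pdf(t, m, s):
    return np.exp(-0.5 * ((t - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# g = (f1 + f2) / 2, the claimed PDF of Y.
g = lambda t: 0.5 * (normal_pdf(t, 0.0, 1.0) + normal_pdf(t, 3.0, 2.0))

edges = np.linspace(-5.0, 10.0, 61)
hist, _ = np.histogram(y, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = float(np.max(np.abs(hist - g(centers))))

# E[Y] should be near (0 + 3)/2 = 1.5 by the mixture formula.
print(float(y.mean()), max_err)
```

Note that the correlation between $X_1$ and $X_2$ plays no role in the answer, exactly as the derivation above predicts: only the marginals $f_1$ and $f_2$ enter $g$.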