Expectation value of the sum of two probability density functions


I am having trouble understanding the following (this is NOT homework):

Given, where $f_1(X)$ is a probability density function of $X$: \begin{equation} E_X(X_1)=\int_{-\infty}^{\infty} dX f_1(X) X \end{equation}

And, where $f_2(X)$ is a different probability density function of $X$: \begin{equation} E_X(X_2)=\int_{-\infty}^{\infty} dX f_2(X) X \end{equation}

What is the expected value $E_X(X_n)$ of $X$ for the sum of the two probability density functions?

Am I correct in the following?: \begin{equation} \begin{split} E_X(X_n)&=\frac{E_X(X_1)+E_X(X_2)}{2} \end{split} \end{equation}


2 Answers


The sum of two probability density functions is not a probability density function, nor is the sum of two probability measures a probability measure: it integrates to 2, not to 1. Weighted averages (a.k.a. convex combinations or mixtures) such as $p_1f_1 + p_2f_2$, where $p_1+p_2=1$ with $p_1\ge0$, $p_2\ge0$ (or in your case, $(f_1+f_2)/2$), are.

They describe what you get with a compound experiment. If a $p_1:p_2$ coin comes up heads, pick from distribution $f_1$, otherwise from $f_2$. Your conjectured formula is correct for the $p_1=p_2=1/2$ case of this. The mixture of two distributions is what a physicist might call a superposition (by analogy with wave functions). The expectation of such a mixture is $p_1EX_1+p_2EX_2$, reflecting the calculus fact that $$ \int x \,(p_1f_1(x)+p_2f_2(x))\,dx = p_1 \int x\, f_1(x)\,dx + p_2 \int x\, f_2(x)\,dx.$$
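As a sanity check (not part of the original answer), here is a short Python simulation of this compound experiment; the component distributions and weights below are made up purely for illustration:

```python
import random

random.seed(0)

# Hypothetical example: mixture of two normal components.
# Component 1: N(mu1, 1), component 2: N(mu2, 1), mixing weights p1, p2.
p1, p2 = 0.3, 0.7
mu1, mu2 = 1.0, 5.0

def sample_mixture():
    """Compound experiment: flip a p1-biased coin,
    then draw from the chosen component distribution."""
    if random.random() < p1:
        return random.gauss(mu1, 1.0)
    return random.gauss(mu2, 1.0)

n = 200_000
empirical_mean = sum(sample_mixture() for _ in range(n)) / n
theoretical_mean = p1 * mu1 + p2 * mu2  # p1*E(X1) + p2*E(X2) = 3.8

print(empirical_mean, theoretical_mean)
```

The empirical mean of the mixture draws converges to $p_1EX_1+p_2EX_2$, matching the integral identity above.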

A linguistically similar but mathematically different thing is the sum of two random variables $X_1+X_2$, where $X_1$ comes from a pick from $f_1$ and $X_2$ comes from a pick from $f_2$. This is what bluemaster's answer is about. The corresponding operation on the distributions is convolution: if the two random variables are independent, the distribution of their sum is $f_1\star f_2$. In any case the expectation of their sum is $E(X_1+X_2)=EX_1 + EX_2$. In the independence case this is reflected in the calculus fact that $$\int\int (x+y)\, f_1(x)f_2(y)\,dxdy = \int x\, f_1(x)\,dx + \int y \,f_2(y)\,dy.$$
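To see the other interpretation numerically (again an illustrative sketch, with distributions chosen arbitrarily), draw independent $X_1$ and $X_2$ and average their sum:

```python
import random

random.seed(1)

# Hypothetical example: X1 ~ Exponential(rate=2), X2 ~ Uniform(0, 3),
# drawn independently. E(X1) = 1/2, E(X2) = 3/2, so E(X1 + X2) = 2.
n = 200_000
total = 0.0
for _ in range(n):
    x1 = random.expovariate(2.0)   # mean 1/2
    x2 = random.uniform(0.0, 3.0)  # mean 3/2
    total += x1 + x2

empirical_sum_mean = total / n
print(empirical_sum_mean)  # close to 2.0
```

Note that linearity of expectation, $E(X_1+X_2)=EX_1+EX_2$, holds even without independence; independence is only needed for the convolution formula for the density of the sum.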

There are two distinct concepts here: adding or averaging the values of random variables is not the same thing as adding or averaging their probabilities. It was not at all clear to me from your question (and subsequent comment) which you were asking about, as evidenced by Bluemaster and me interpreting it differently. If this kind of question is important to you, you should learn enough probability theory terminology and notation to make yourself clear when asking questions like this.


What you want is the expected value of the sum of two random variables, not the expected value of the sum of two probability density functions (associated to the random variables).

For instance, if $Z=X_1+X_2$ it is true that $E(Z)=E(X_1)+E(X_2)$. It is easy to prove this: let $f(x_1,x_2)$ be the joint distribution of $X_1$, $X_2$, and let $f_1(x_1)$, $f_2(x_2)$ be the marginal distributions of these random variables. Then:

$$E(X_1+X_2)=\int_{\Omega_{X_1}} \int_{\Omega_{X_2}} (x_1+x_2) f(x_1,x_2)\, d x_1\, dx_2$$ $$E(X_1+X_2)=\int_{\Omega_{X_1}} \int_{\Omega_{X_2}} x_1 f(x_1,x_2)\, d x_1\, dx_2+\int_{\Omega_{X_1}}\int_{\Omega_{X_2}} x_2 f(x_1,x_2)\, dx_1\, dx_2$$ $$E(X_1+X_2)=\int_{\Omega_{X_1}} x_1 \int_{\Omega_{X_2}} f(x_1,x_2)\, d x_2\, dx_1+\int_{\Omega_{X_2}}x_2\int_{\Omega_{X_1}} f(x_1,x_2)\, dx_1\, dx_2$$ $$E(X_1+X_2)=\int_{\Omega_{X_1}} x_1 f_1(x_1)\, d x_1+\int_{\Omega_{X_2}}x_2 f_2(x_2)\, dx_2$$ $$E(X_1+X_2)=E(X_1)+E(X_2)$$

This result, of course, depends on the convergence of the integrals. It is possible to show cases in which the expected values do not exist.
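The marginalization steps above can be checked numerically. The sketch below uses the (correlated, hence non-independent) joint density $f(x_1,x_2)=x_1+x_2$ on the unit square, chosen purely as an example because its integrals are easy to do by hand: the marginals are $f_1(x)=f_2(x)=x+\tfrac12$, each mean is $\tfrac{7}{12}$, and $E(X_1+X_2)=\tfrac{7}{6}$.

```python
# Midpoint-rule check that the double integral of (x1 + x2) f(x1, x2)
# equals the sum of the two marginal expectations.
n = 400
h = 1.0 / n
xs = [(i + 0.5) * h for i in range(n)]  # midpoint grid on [0, 1]

e_sum = 0.0  # E(X1 + X2) via the double integral
for x in xs:
    for y in xs:
        f = x + y                      # joint density f(x1, x2) = x1 + x2
        e_sum += (x + y) * f * h * h

e_x1 = sum(x * (x + 0.5) * h for x in xs)  # marginal f1(x) = x + 1/2
e_x2 = e_x1                                # X2 is symmetric to X1 here

print(e_sum, e_x1 + e_x2)  # both approximately 7/6
```

Because the joint density does not factor as $f_1(x_1)f_2(x_2)$, this also illustrates that the identity $E(X_1+X_2)=E(X_1)+E(X_2)$ does not require independence.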