Estimating number drawn from one distribution based on sum of that number and number drawn from another distribution


I have been working on this for several days and have been unable to come up with an answer. The problem is very simple to state, but it seems difficult to solve.

A computer draws a number $x$ at random from a uniform distribution between $a$ and $b$. The computer also draws a number $y$ from a normal distribution with mean $m$ and standard deviation $s$. The computer then calculates $z = x + y$. The computer reports $z$ but it does not report $x$ or $y$. Calculate the expected value of $x$ given $z$.

So far, I have noted that the probability density function of $z$ is the convolution of the uniform and normal density functions producing $x$ and $y$, respectively. And I believe that I need to use Bayes' Theorem to determine the expected value of $x$ given $z$, but I am stuck soon beyond that point. Any help much appreciated!
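One way to build intuition before attacking the math is to simulate the setup and estimate $E[x \mid z]$ empirically by binning. This is a sketch with made-up parameter values (the problem leaves $a$, $b$, $m$, $s$ unspecified):

```python
import random

random.seed(1)

# Hypothetical parameter values for illustration only.
a, b, m, s = 0.0, 1.0, 0.0, 0.2

# Draw many (x, z) pairs, then estimate E[x | z near z0] by keeping
# only the draws whose z landed in a narrow bin around z0.
z0, half_width = 0.7, 0.01
kept = []
for _ in range(1_000_000):
    x = random.uniform(a, b)
    z = x + random.gauss(m, s)
    if abs(z - z0) < half_width:
        kept.append(x)

estimate = sum(kept) / len(kept)
print(estimate)  # falls below z0 = 0.7, since x is confined to [0, 1]
```

The estimate is pulled from $z_0$ toward the interior of $[a, b]$, which is the qualitative behavior any correct closed-form answer should reproduce.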

1 Answer

You don't say explicitly that $X$ and $Y$ are independent random variables but since you use convolution to compute the density of $Z = X+Y$, I expect that you have been told, or are assuming, that $X$ and $Y$ are independent random variables.

Here is one way to approach the problem.

Assume for simplicity that $a = 0$, $b = 1$, and $m = 0$; the general case follows by shifting and rescaling. By independence, the joint density of $X$ and $Y$ is $$f_{X,Y}(x,y) = \frac{1}{s\sqrt{2\pi}}\exp(-y^2/2s^2), \quad 0 \leq x \leq 1, \; -\infty < y < \infty.$$ From this, work out the joint density $f_{X,Z}(x,z)$ of $(X, X+Y) = (X,Z)$ using standard methods (Jacobians will be involved), and then $$f_{X|Z}(x|z) = \frac{f_{X,Z}(x,z)}{f_Z(z)} = \frac{f_{X,Z}(x,z)}{\int_{-\infty}^{\infty}f_{X,Z}(x,z)\,\mathrm dx}.$$ From this, you can calculate $$E[X|Z] = \int_{-\infty}^{\infty} xf_{X|Z}(x|z)\,\mathrm dx.$$

Alternatively, since conditioned on $X = x$, $Z$ is a Gaussian random variable with mean $x$ and variance $s^2$, $$f_{X|Z}(x|z) = \frac{f_{Z|X}(z|x)f_X(x)}{f_Z(z)} = \frac{1}{f_Z(z)}\cdot\frac{1}{s\sqrt{2\pi}}\exp(-(x-z)^2/2s^2), \quad 0 \leq x \leq 1,$$ where now $f_Z(z)$ can be recognized as the normalizing constant that makes the function on the right a probability density; it is easily expressed in terms of the standard Gaussian CDF $\Phi(\cdot)$ or the $\operatorname{erf}$ function. In computing $E[X|Z]$, you will need the fact that $x\exp(-x^2/2)$ has an elementary antiderivative, namely $-\exp(-x^2/2)$.
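Carrying the second approach through gives a closed form. The following sketch (under the $a=0$, $b=1$, $m=0$ normalization; function names are my own) implements it and checks it against brute-force numerical integration of the conditional density:

```python
import math

def phi(u):
    """Standard normal pdf."""
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

def Phi(u):
    """Standard normal cdf, via math.erf."""
    return 0.5 * (1 + math.erf(u / math.sqrt(2)))

def cond_mean(z, s):
    """E[X | Z = z] for X ~ Uniform(0,1), Y ~ Normal(0, s^2), Z = X + Y.

    Integrating x * f_{X|Z}(x|z) over [0, 1] with the substitution
    u = (x - z)/s gives
        E[X | Z = z] = z + s * (phi(u0) - phi(u1)) / (Phi(u1) - Phi(u0)),
    where u0 = -z/s, u1 = (1 - z)/s, and Phi(u1) - Phi(u0) = f_Z(z).
    """
    u0, u1 = -z / s, (1 - z) / s
    return z + s * (phi(u0) - phi(u1)) / (Phi(u1) - Phi(u0))

def cond_mean_numeric(z, s, n=100_000):
    """Brute-force check: midpoint-rule integration of x * f_{X|Z}(x|z) on [0, 1]."""
    num = den = 0.0
    for i in range(n):
        x = (i + 0.5) / n
        w = phi((x - z) / s)  # unnormalized conditional density
        num += x * w
        den += w
    return num / den
```

By symmetry, `cond_mean(0.5, s)` is exactly $0.5$ for any $s$; away from the midpoint, the answer is pulled from $z$ toward the interior of $[0, 1]$, which matches the intuition that $X$ cannot lie outside its support.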