Given a function, calculate MMSE and LMMSE


Let $X = \frac{1}{1+U}$ where $U$ is uniformly distributed over $[0,1]$. I need to evaluate the MMSE estimator $E[X\mid U]$ and the LMMSE estimator $\hat{E}[X\mid U]$, and then calculate the corresponding mean square errors, $E[(X-E[X\mid U])^2]$ and $E[(X-\hat{E}[X\mid U])^2]$.


I know that, in general, the pdf of a uniform distribution on $[a,b]$ is $\frac{1}{b-a}$ and the mean is $\frac{a+b}{2}$.

In general, the minimum mean square error estimator is simply the conditional mean, \begin{align} E[X\mid Y=y] &= \int x f_{X\mid Y}(x\mid y) \, dx \\ f_{X\mid Y}(x\mid y) &:= \frac{f_{XY}(x,y)}{f_Y(y)}\\ f_Y(y) &= \int_{-\infty}^\infty f_{XY}(x,y) \, dx \end{align}

In general, the linear minimum mean square error (LMMSE) estimator is defined as \begin{align} \hat{E}[X\mid Y=y] &= \mathbb E[X] + \operatorname{Cov}(X,Y)\operatorname{Cov}(Y)^{-1}(y-E[Y]) \end{align}
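(Not part of the question, but a handy cross-check.) Both coefficients of the LMMSE line can be approximated from samples; the sketch below assumes NumPy is available and is specialized to the $X = \frac{1}{1+U}$ of this problem:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, 1_000_000)   # U ~ Uniform[0, 1]
x = 1.0 / (1.0 + u)                    # X = 1/(1+U) from the problem

# Sample versions of the LMMSE coefficients:
# slope = Cov(X,U) / Var(U), intercept chosen so the line passes through the means
a = np.cov(x, u)[0, 1] / np.var(u, ddof=1)
b = x.mean() - a * u.mean()            # E-hat[X|U] = a*U + b
```

The slope comes out negative, which agrees with the closed-form answer worked out below.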


I am having trouble expressing the given function, $X = \frac{1}{1+U}$, in terms of the joint and conditional pdfs.


BEST ANSWER

Since $X = \displaystyle \frac{1}{1+U}$, the conditional expectation $E[X\mid U = \alpha]$, the expected value of $X$ given that $U = \alpha$, is the expected value of $\displaystyle \frac{1}{1+U}$ given that $U = \alpha$, and is thus just $\displaystyle \frac{1}{1+\alpha}$. Thus, $$E[X \mid U] = \frac{1}{1+U}$$ is the MMSE estimator for $X$ given $U$. This varies from $1$ when $U = 0$ to $\frac{1}{2}$ when $U = 1$.
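To see numerically why the first MSE, $E[(X-E[X\mid U])^2]$, is exactly zero, here is a small Monte Carlo sketch (Python with NumPy assumed): since $X$ is a deterministic function of $U$, the MMSE estimate matches $X$ sample by sample.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, 100_000)   # U ~ Uniform[0, 1]
x = 1.0 / (1.0 + u)                  # X = 1/(1+U)
mmse_est = 1.0 / (1.0 + u)           # E[X|U] = 1/(1+U)
mse = np.mean((x - mmse_est) ** 2)   # zero: the estimator recovers X exactly
```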

For the linear minimum-mean-square-error (LMMSE) estimator, you need to find $E[X]$ which is just $$E[X] = E[E[X \mid U]] = E\left[\frac{1}{1+U}\right] = \int_{-\infty}^\infty \frac{1}{1+u}f_U(u)\,\mathrm du = \int_0^1 \frac{\mathrm du}{1+u}$$ whose value you should work out for yourself.

Write down $\displaystyle E[X] = \int_0^1 \frac{\mathrm du}{1+u} = \cdots \quad$ after computing the integral shown above and putting its value where I have written $\cdots$. Draw a box around this so you can find the numerical value of $E[X]$ again easily. You will need it in the future.

Next, $$\operatorname{cov}(X,U) = E[XU] - E[X]E[U] = E\left[\frac{U}{1+U}\right] - E[X]E[U]$$ where all the quantities on the right are readily computed.

Repeat slowly three times:

  1. I can compute $E\left[\frac{U}{1+U}\right]$ using the law of the unconscious statistician as $$E\left[\frac{U}{1+U}\right] = \int_{-\infty}^{+\infty} \frac{u}{1+u}f_U(u)\,\mathrm du = \int_0^1 \frac{u}{1+u}\,\mathrm du = \Bigl[u - \ln(1+u)\Bigr]_0^1 = 1 - \ln(2).$$
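If you want to double-check that integral without trusting the antiderivative, a Simpson's-rule evaluation using only the Python standard library will do (a sketch, not part of the answer):

```python
import math

# Simpson's rule check of the integral of u/(1+u) over [0, 1]
# against the closed form 1 - ln 2
def f(t):
    return t / (1.0 + t)

n = 1000                      # even number of subintervals
h = 1.0 / n
s = f(0.0) + f(1.0)
for i in range(1, n):
    s += (4.0 if i % 2 else 2.0) * f(i * h)
integral = s * h / 3.0
exact = 1.0 - math.log(2.0)   # 1 - ln 2
```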

  2. I do not need to compute $E[X]$ again because I already found its value and I have saved it for future use.

  3. I will not write $E[X] = \frac{1}{1+U}$ (as I did in the comments) and needlessly confuse myself because of #2 above. I already know the numerical value of $E[X]$, and I also understand that this real constant cannot possibly equal $\frac{1}{1+U}$ which is a random variable.

  4. I already know that $E[U] = \frac{1}{2}$ and so I don't need to find it again.

Now, compute $\operatorname{cov}(X,U) = E\left[\frac{U}{1+U}\right] - E[X]E[U]$ where the three expectations on the right have known numerical values that you have just computed. Still doesn't work? Carry out the instructions in the highlighted text above one more time.

In order to compute the LMMSE estimator, you will also need $\operatorname{var}(U)$ which I hope you can also compute easily (or use a standard formula) to arrive at the answer $\frac{1}{12}$.

Now put it all together. You should get that the LMMSE estimator is a straight line $au+b$ of negative slope that intersects the hyperbola $\frac{1}{1+u}$ (the MMSE estimator) in two places.
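As a sketch of the assembly (it assumes you have already found the boxed value $E[X] = \ln 2$, and it reuses $E\left[\frac{U}{1+U}\right] = 1 - \ln 2$ from step 1 and $\operatorname{var}(U) = \frac{1}{12}$):

```python
import math

# Assemble the LMMSE line a*u + b from the pieces computed above
E_X  = math.log(2.0)          # the boxed value of the integral of 1/(1+u) over [0,1]
E_U  = 0.5                    # mean of Uniform[0, 1]
E_XU = 1.0 - math.log(2.0)    # E[U/(1+U)], from step 1
cov_XU = E_XU - E_X * E_U
var_U  = 1.0 / 12.0

a = cov_XU / var_U            # slope (comes out negative)
b = E_X - a * E_U             # intercept

# Where does a*u + b meet the hyperbola 1/(1+u)?
# a*u + b = 1/(1+u)  =>  a*u^2 + (a+b)*u + (b - 1) = 0
disc = (a + b) ** 2 - 4.0 * a * (b - 1.0)
r1 = (-(a + b) + math.sqrt(disc)) / (2.0 * a)
r2 = (-(a + b) - math.sqrt(disc)) / (2.0 * a)
# disc > 0 and both roots lie in (0, 1): two intersection points, as claimed
```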


First, derive the marginal density of $X$, which by the change-of-variable formula can be seen to be

$$f_X(x) = x^{-2} \qquad x\in \left[\frac 12,\; 1\right]$$

and zero elsewhere. The support has been calculated from the functional form for $X$ and from the fact that $U \in [0,1]$, and you can verify that this is a proper pdf.
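Both claims can be spot-checked in plain Python (a sketch; the threshold $x = 0.8$ is an arbitrary choice of mine): the density integrates to $1$ over $[\frac12, 1]$, and the CDF it implies matches simulated draws of $X = \frac{1}{1+U}$.

```python
import random

# Integral of x^(-2) over [1/2, 1] is [-1/x] evaluated from 1/2 to 1 = -1 + 2 = 1
total = -1.0 / 1.0 + 1.0 / 0.5

# Empirical check at the (arbitrary) threshold x = 0.8:
# F_X(0.8) = integral of x^(-2) over [1/2, 0.8] = 2 - 1/0.8 = 0.75
cdf = 2.0 - 1.0 / 0.8
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n) if 1.0 / (1.0 + random.random()) <= 0.8)
emp = hits / n                # should be close to 0.75
```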

Now what does it mean "the distribution of $X$ given $U$"? At cumulative distribution function level, this would be expressed as

$$F_{X|U}(x|u) = P(X\le x\mid U\le u) $$ Namely, the probability of $X$ being smaller than some value $x$, given that $U$ is smaller than some value $u$. So we have $$U\le u \Rightarrow 1+ U \le 1+u \Rightarrow X = \frac{1}{1+U} \ge \frac{1}{1+u}$$

The effect of conditioning on $U$ is that the lower bound of $X$ now depends on $U$. In the unconditional case, we calculated the lower bound for the support of $X$ by considering the maximum value $U$ can take, i.e. unity. Now the maximum value $U$ can take is some value $u$. It is as though $X$ is a function of a uniform random variable that ranges in $[0,u]$, with density $\frac 1u$. Applying the change-of-variable formula to this case, we obtain

$$f_{X|U}(x|u) = \frac 1ux^{-2} \qquad x\in \left[\frac{1}{1+u},\; 1\right]$$
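As a check that this conditional density is proper, its total mass is $1$ for every $u$, since $\int_{1/(1+u)}^{1} \frac 1u x^{-2}\,\mathrm dx = \frac 1u\bigl((1+u) - 1\bigr) = 1$. A few spot checks in Python (an illustrative sketch, with $u$ values picked arbitrarily):

```python
# Spot-check that (1/u) * x^(-2) has total mass 1 on [1/(1+u), 1]
masses = []
for u in (0.1, 0.5, 0.9):
    lo = 1.0 / (1.0 + u)                          # lower endpoint 1/(1+u)
    masses.append((1.0 / u) * (1.0 / lo - 1.0))   # (1/u) * [-1/x] from lo to 1
```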

I guess you can take it from here.