Conditional probability textbook question: Let $L(\cdot)$ be a scalar function


Here is the problem:

Let $L(\cdot)$ be a scalar function, with $L(0)=0, L(y) \geq L(z)$ for $\Vert y\Vert \geq \Vert z \Vert, L(y) = L(-y),$ and with $L(\cdot)$ convex. Let $p_{X|Y}(x|y)$ be symmetric about $\hat x = E\{X\mid Y=y\}$. Prove that for all $z$, $$E\{L(X-\hat x)\mid Y=y\}\leq E\{L(X-z)\mid Y=y\}$$

Here is a hint provided by the textbook:

[Hint: Set $\tilde x = x- \hat x, \tilde z = z - \hat x$, and show that $$E\{L(X-z)\mid Y=y\} = \int L(\tilde z - \tilde x)p_{\tilde X|Y}(\tilde x | y )d\tilde x = \int L(\tilde z + \tilde x)p_{\tilde X|Y}(\tilde x | y )d\tilde x $$

$$= \int\frac{1}{2}[L(\tilde z-\tilde x) + L(\tilde z + \tilde x)]p_{\tilde X|Y}(\tilde x | y)d\tilde x $$

Then use the evenness and convexity of $L(\cdot).$]

Now, I can use the hint to get from the last expression to the desired inequality, but I'm having trouble understanding the equality of the first two integrals in the hint. Since $p_{X|Y}(x|y)$ is symmetric about $\hat x$, the density of $\tilde X = X - \hat x$ is symmetric about the origin, i.e. $p_{\tilde X|Y}(\tilde x | y) = p_{\tilde X|Y}(-\tilde x | y)$. But I don't see that $L(\tilde z - \tilde x) = L(\tilde z + \tilde x)$, and as far as I can tell $\int_{-\infty}^{\infty}L(\tilde z-\tilde x)\,d\tilde x \neq \int_{-\infty}^{\infty} L(\tilde z + \tilde x)\,d\tilde x$ even after manipulating the limits of integration. So, my question is: why is

$$\int L(\tilde z - \tilde x)p_{\tilde X|Y}(\tilde x | y )d\tilde x = \int L(\tilde z + \tilde x)p_{\tilde X|Y}(\tilde x | y )d\tilde x$$

true?
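For concreteness, here is as far as I can get with a change of variables: substituting $u = -\tilde x$ (so $d\tilde x = -du$, and the flipped limits cancel the sign) in the first integral seems to reduce the question to the symmetry of the density alone,

$$\begin{aligned}
\int_{-\infty}^{\infty} L(\tilde z - \tilde x)\, p_{\tilde X|Y}(\tilde x \mid y)\, d\tilde x
  &= \int_{-\infty}^{\infty} L(\tilde z + u)\, p_{\tilde X|Y}(-u \mid y)\, du \\
  &\stackrel{?}{=} \int_{-\infty}^{\infty} L(\tilde z + u)\, p_{\tilde X|Y}(u \mid y)\, du,
\end{aligned}$$

where the last step would follow if $p_{\tilde X|Y}(-u \mid y) = p_{\tilde X|Y}(u \mid y)$. Is this (i.e. the evenness of the density, with no property of $L$ needed) the intended argument?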

This is Problem 3.6 from *Optimal Filtering* by Anderson and Moore, pages 33-34.