"Classical" equality for expectations


I have been self-studying Bayesian decision theory from the book by Christian Robert (2007) and ran into this simple-looking equality, which Robert refers to as a "classical equality":

$$ k_1 \int_{-\infty}^d (d-\theta)\pi(\theta\mid x)\,d\theta + k_2\int_{d}^{\infty}(\theta-d)\pi(\theta\mid x)\,d\theta\\ = k_1 \int_{-\infty}^d P^{\pi}(\theta<y\mid x)\,dy + k_2\int_{d}^{\infty} P^{\pi}(\theta>y\mid x)\,dy, $$ where $k_1,k_2$ are constants, $d\in\mathbb{R}$, $\pi(\cdot\mid x)$ is a pdf, and $P^{\pi}$ is, presumably, the CDF associated with $\pi$. Robert says that this equality follows from integration by parts. But it is unclear to me how one obtains this result by integration by parts, and where the "new" variable $y$ comes from.
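Before worrying about the proof, it is easy to check the equality numerically. The sketch below is an assumed setup: it takes $\pi(\cdot\mid x)$ to be a standard normal density, picks arbitrary illustrative values for $k_1$, $k_2$, $d$, and truncates the infinite integrals at $\pm 10$, where the normal tails are negligible.

```python
import math

# Assumed example: pi(. | x) is a standard normal pdf; any pdf with a
# finite mean would work just as well for this check.
def pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def cdf(t):
    # P(theta < t) for the standard normal, via the error function
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

def trapz(f, a, b, n=20000):
    # simple trapezoid rule on [a, b]
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

k1, k2, d = 2.0, 3.0, 0.7   # arbitrary illustrative constants
LO, HI = -10.0, 10.0        # normal tails beyond +/-10 are negligible

# left-hand side: expected-loss form
lhs = (k1 * trapz(lambda t: (d - t) * pdf(t), LO, d)
       + k2 * trapz(lambda t: (t - d) * pdf(t), d, HI))

# right-hand side: CDF ("layer cake") form
rhs = (k1 * trapz(cdf, LO, d)                      # P(theta < y)
       + k2 * trapz(lambda y: 1 - cdf(y), d, HI))  # P(theta > y)

print(lhs, rhs)  # the two sides agree to numerical precision
```

The two printed values coincide up to the discretization error of the trapezoid rule, which is what the identity predicts for any choice of $k_1$, $k_2$, and $d$.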


There is 1 answer below.

Best Answer

This is the standard "layer cake" trick for rewriting an expectation in terms of the CDF:

$$ \begin{align} \int_{-\infty}^d (d - \theta)\pi(\theta\mid x)\,d\theta &= \int_{-\infty}^d \left(\int_{\theta}^d dy\right)\pi(\theta\mid x)\,d\theta \\ &= \int_{-\infty}^d \int_{\theta}^d \pi(\theta\mid x)\, dy\, d\theta \\ &= \int_{-\infty}^d \int_{-\infty}^y \pi(\theta\mid x)\, d\theta\, dy \\ &= \int_{-\infty}^d P^\pi(\theta < y \mid x)\, dy \end{align}$$

The key step is exchanging the order of integration (Fubini's theorem), noting that the region of integration is $-\infty < \theta < y < d$.
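For completeness, the second term follows by the symmetric argument, this time over the region $d < y < \theta < \infty$:

$$ \begin{align} \int_{d}^{\infty} (\theta - d)\pi(\theta\mid x)\,d\theta &= \int_{d}^{\infty} \left(\int_{d}^{\theta} dy\right)\pi(\theta\mid x)\,d\theta \\ &= \int_{d}^{\infty} \int_{y}^{\infty} \pi(\theta\mid x)\, d\theta\, dy \\ &= \int_{d}^{\infty} P^\pi(\theta > y \mid x)\, dy \end{align}$$

Multiplying the two identities by $k_1$ and $k_2$ respectively and adding them yields the classical equality. So the "new" variable $y$ simply comes from writing the linear factor $(d-\theta)$ or $(\theta-d)$ as an integral of $1$.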