In *Statistical Inference* by Casella and Berger, as an example of the fact that uncorrelated random variables are not necessarily independent, the authors show that if $X\sim f(x-\theta)$, where $f$ is symmetric about $0$, and $Y=I(|X-\theta|<2)$, then $E(XY)=E(X)E(Y)$.
For this they claim that $$E(XY)=\int_{-\infty}^{\infty}xI(|x-\theta|<2)f(x-\theta)dx$$ I don't quite understand how they arrive at this expression. What is the joint pdf of $(X,Y)$ in this case?
I am going to assume that $f_X(x) = f(x-\theta)$. By definition
$$Y=\left\{ \begin{array}{ll} 1 & \lvert X-\theta \rvert < 2\\ 0 & \text{otherwise} \end{array} \right . $$
and
\begin{align*} E[XY] &= \sum_{y\in\{0,1\}}\int_{-\infty}^{\infty} xy\,f_{X,Y}(x,y)\,dx\\ &= \int_{-\infty}^{\infty} xf_{X,Y}(x,1)\,dx \tag{1}\\ \end{align*}
(note that $Y$ is discrete, so we sum over its two values rather than integrate).
We also have that
$$f_{X,Y}(x,1) = f_{X\mid Y}(x\mid 1)P(Y=1) \tag{2}$$
Let's first compute $F_{X\mid Y}(x\mid 1)$:
\begin{align*} F_{X\mid Y}(x\mid 1) &= P(X \leq x \mid Y=1)\\ &= \left\{ \begin{array}{ll} 0 & x \leq \theta-2\\ \frac{F_X(x) - F_X(\theta-2)}{P(Y=1)} & \lvert x-\theta \rvert < 2\\ 1 & x \geq \theta+2 \end{array} \right . \\ \end{align*}
Then
\begin{align*} f_{X\mid Y}(x\mid 1) &= \frac{dF_{X\mid Y}(x\mid 1)}{dx}\\ &= \left\{ \begin{array}{ll} \frac{f_X(x)}{P(Y=1)} & \lvert x-\theta \rvert < 2\\ 0 & \text{otherwise} \end{array} \right . \\ \end{align*}
Therefore, putting it all together, (2) becomes $f_{X,Y}(x,1) = f_X(x)I(\lvert x-\theta \rvert < 2)$. Substituting this into (1) gives the desired result.
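As a sanity check, we can verify $E(XY)=E(X)E(Y)$ numerically by Monte Carlo. This is just a sketch under an assumed choice of $f$: the standard normal density (symmetric about $0$), with an arbitrary $\theta = 1.5$.

```python
import numpy as np

# Monte Carlo check of E[XY] = E[X] E[Y] for the example above.
# Assumption: f is the standard normal density (symmetric about 0),
# so X ~ N(theta, 1); theta = 1.5 is an arbitrary choice.
rng = np.random.default_rng(0)
theta = 1.5
x = rng.normal(loc=theta, scale=1.0, size=2_000_000)
y = (np.abs(x - theta) < 2).astype(float)  # Y = I(|X - theta| < 2)

print(np.mean(x * y))           # estimate of E[XY]
print(np.mean(x) * np.mean(y))  # estimate of E[X] E[Y]
```

Both estimates agree (up to Monte Carlo error) with the closed form $\theta\,P(\lvert X-\theta\rvert<2)$, even though $X$ and $Y$ are clearly dependent.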