I have found a "result" that I am somewhat skeptical about. Could someone please check whether the following argument is sound:
Claim: For any source $X$ with variance $\sigma^2$ the rate distortion function with respect to the squared error metric $E[(\hat{X} - X)^2]$ is given by
\begin{equation} R(D) = \begin{cases} 0 & \text{if } D > \sigma^2, \\ h(X) - \frac{1}{2} \log(2 \pi e D) & \text{otherwise.} \end{cases} \end{equation}
Proof: The first case is obvious; for $D > \sigma^2$ it suffices to select $\hat{X} = E[X]$, which incurs distortion $\sigma^2 < D$ at zero rate. For the second, consider choosing $\hat{X} = X + Z$ where $Z \sim \mathcal{N}(0, D)$ is independent of $X$. We can then see that the lower bound on the rate distortion function (for a squared error measure) given by $h(X) - \frac{1}{2} \log(2 \pi e D)$ [1] is indeed achieved:
\begin{align} I(X;\hat{X}) &= h(X) - h(X \mid \hat{X}) \\ &= h(X) - h(X \mid X + Z) \\ &= h(X) - h(X + Z - Z \mid X + Z) \\ &= h(X) - h(-Z) \\ &= h(X) - h(Z) \\ &= h(X) - \frac{1}{2} \log(2 \pi e D) \end{align}
[1]: Elements of Information Theory, 2nd Edition, Chapter 10, Exercise 8
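As a sanity check on the claimed formula itself: for a Gaussian source, $h(X) = \frac{1}{2}\log(2\pi e \sigma^2)$, so the claim reduces to the standard Gaussian rate-distortion function $R(D) = \frac{1}{2}\log(\sigma^2/D)$. A quick numerical check of that reduction (the values $\sigma^2 = 4$, $D = 1$ are arbitrary examples):

```python
import numpy as np

sigma2, D = 4.0, 1.0  # example variance and distortion, with D < sigma^2

# Differential entropy of a Gaussian source with variance sigma^2
h_X = 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Claimed formula: h(X) - (1/2) log(2*pi*e*D)
claimed = h_X - 0.5 * np.log(2 * np.pi * np.e * D)

# Known Gaussian rate-distortion function: (1/2) log(sigma^2 / D)
known = 0.5 * np.log(sigma2 / D)

print(claimed, known)  # the two expressions agree in the Gaussian case
```

So the formula is at least consistent with the Gaussian case; the question is whether the achievability argument above holds for general $X$.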
Never mind, the error is obvious:
\begin{equation} h(-Z \mid X + Z) \neq h(-Z) \end{equation}
Clarification: This invalidates the third-to-fourth equality in the chain in the question: $h(X + Z - Z \mid X + Z) = h(-Z \mid X + Z) \neq h(-Z)$, since $Z$ is not independent of $X + Z$ (conditioning strictly reduces the entropy here).
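To see the gap concretely, take $X \sim \mathcal{N}(0, \sigma^2)$ independent of $Z \sim \mathcal{N}(0, D)$, where everything has a closed form: the true mutual information is $I(X; X+Z) = h(X+Z) - h(Z) = \frac{1}{2}\log(1 + \sigma^2/D)$, which strictly exceeds the value $h(X) - h(Z) = \frac{1}{2}\log(\sigma^2/D)$ that the faulty chain produces. A small numerical illustration (with arbitrary example values $\sigma^2 = 4$, $D = 1$):

```python
import numpy as np

sigma2, D = 4.0, 1.0  # X ~ N(0, sigma2), Z ~ N(0, D), independent

# True mutual information for the additive-noise pair (X, X+Z):
# I(X; X+Z) = h(X+Z) - h(Z) = (1/2) log((sigma2 + D) / D)
I_true = 0.5 * np.log(1 + sigma2 / D)

# What the faulty equality chain yields: h(X) - h(Z) = (1/2) log(sigma2 / D)
I_faulty = 0.5 * np.log(sigma2 / D)

print(I_true, I_faulty)  # I_true is strictly larger
```

The strict inequality $I(X; X+Z) > h(X) - h(Z)$ is exactly the statement $h(-Z \mid X+Z) < h(-Z)$, confirming that the test channel $\hat{X} = X + Z$ does not achieve the lower bound.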