Relative entropy between discrete and continuous random variables


Is it possible to define the relative entropy between a discrete and a continuous random variable? Say $P$ is a discrete pmf and $Q$ is a continuous pdf; what is $D(P\|Q)$?


A heuristic: let $P(x)$ be a train of $n$ thin rectangles centered at points $x_i$, $i = 1, \dots, n$, with widths $\delta$ and heights $p_i/\delta$. Then, assuming $Q(x)$ is smooth and positive at the points $x_i$: $$ D(P\|Q)=\int P(x) \log \frac{P(x)}{Q(x)}\, dx \approx \sum_{i=1}^n p_i \log \frac{p_i}{\delta\, Q(x_i)} = \sum_{i=1}^n p_i \log \frac{p_i}{Q(x_i)} - \log \delta, $$ where the last equality uses $\sum_i p_i = 1$. This tends to $+\infty$ as $\delta \to 0$. (Alternatively, we could also think of $Q$ as a discrete variable with many values.)
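The heuristic above can be checked numerically. A minimal sketch, assuming a hypothetical three-point pmf $P$ and taking $Q$ to be the standard normal density: the rectangle approximation to $D(P\|Q)$ grows exactly like $-\log\delta$ as the rectangle width $\delta$ shrinks.

```python
import math

# Hypothetical example: a discrete pmf P supported on three points,
# and Q taken to be the standard normal density.
xs = [-1.0, 0.0, 1.0]
ps = [0.2, 0.5, 0.3]

def q(x):
    # Standard normal pdf.
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def kl_rectangles(delta):
    # D(P_delta || Q) when P is smeared into rectangles of width delta:
    #   sum_i p_i * log( p_i / (delta * q(x_i)) )
    return sum(p * math.log(p / (delta * q(x))) for p, x in zip(ps, xs))

for delta in [1e-1, 1e-3, 1e-6]:
    print(f"delta = {delta:g}:  D = {kl_rectangles(delta):.4f}")
```

Each factor-of-$10^3$ reduction in $\delta$ increases the value by $3\log 10 \approx 6.9$, consistent with the $-\log\delta$ term, so the divergence blows up in the limit.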