The Kullback-Leibler (K-L) divergence is a measure of how one probability distribution differs from another.
For distributions $P$ and $Q$ of a continuous random variable, the K-L divergence is $$D_{KL}(P||Q) = \int_{-\infty}^{\infty}p(x) \log\left(\frac{p(x)}{q(x)} \right)dx, $$ where $p(x)$ and $q(x)$ are the probability densities of $P$ and $Q$, respectively.
I am trying to obtain a linearized version of the K-L divergence: fix the distribution $Q$ and treat $D_{KL}$ as a function of $P$ alone; call this function $D_{KL}(P)$. I then wish to linearize it around $P=Q$.
By analogy with the standard first-order Taylor formula, the linearization would take the form $$L_{D_{KL}}(P) = D_{KL}(Q) + \frac{dD_{KL}}{dP}(Q)[P-Q], $$ where the derivative would have to be understood as a functional (Gâteaux-type) derivative, since $P$ lives in a space of distributions.
Obviously the first term is $D_{KL}(Q)=0$, but I am not sure how to handle the derivative term, or whether the usual linearization even applies in this setting.
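To get some intuition, I ran a quick numerical sanity check with a discrete analogue of the divergence (the distribution `q` and the zero-sum perturbation `v` below are just made-up illustrative values, not part of any derivation): it suggests that $D_{KL}(Q+\epsilon v \,\|\, Q)$ shrinks like $\epsilon^2$, i.e. the linear term appears to vanish.

```python
import numpy as np

def kl(p, q):
    # Discrete K-L divergence: sum_i p_i * log(p_i / q_i)
    return np.sum(p * np.log(p / q))

# Illustrative fixed distribution Q and a perturbation v with sum(v) = 0,
# so that p = q + eps*v remains normalized for small eps.
q = np.array([0.2, 0.3, 0.5])
v = np.array([0.10, -0.04, -0.06])

for eps in [1e-1, 1e-2, 1e-3]:
    d = kl(q + eps * v, q)
    # If the linearization had a nonzero linear term, d/eps would
    # approach a nonzero constant; instead it keeps shrinking.
    print(f"eps={eps:g}  KL={d:.3e}  KL/eps={d/eps:.3e}  KL/eps^2={d/eps**2:.3e}")
```

The ratio $D_{KL}/\epsilon$ keeps decreasing while $D_{KL}/\epsilon^2$ settles near a constant, consistent with the divergence being second-order (quadratic) around $P=Q$ rather than linear.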